Facebook taught its AI to speak math

Because wiping the floor with us over games of Go really isn't that tough anymore.

I speak two languages, English and Bad English. My understanding of math is significantly worse. In fact, I had to redo Calculus 2A four different times in college in order to graduate, mostly because I could never properly calculate a ladder's rate of acceleration as it fell away from a wall. You know, the sort of theoretical quandaries that really matter in our day to day lives.

My numerical idiocy aside, Facebook has trained an AI to solve the toughest of math problems. Real superstring stuff. In effect, FB has taught its neural network to view complex mathematical equations "as a kind of language and then [treat] solutions as a translation problem for sequence-to-sequence neural networks."

This is actually quite a feat, since most neural networks operate by approximation: they can figure out whether an image is of a dog or a marmoset or a steam radiator with a reasonable amount of certainty, but precisely solving a symbolic equation like b - 4ac = 7 is a whole different kettle of fish. Facebook managed this by treating the equation not like a math problem but like a language problem. Specifically, the research team approached the issue using neural machine translation (NMT). In short, they taught an AI to speak math. The result was a system capable of solving equations in a fraction of the time that algebra-based systems like Maple, Mathematica, and Matlab would take.
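To get a feel for what "speaking math" means in practice, here's a minimal sketch, assuming Python and the sympy library, of how a symbolic expression can be flattened into a prefix-notation token sequence, the kind of "sentence" a sequence-to-sequence model could translate. The tokenization scheme below is my own illustration, not Facebook's exact vocabulary.

```python
# A minimal sketch of the "equations as language" idea: serialize a symbolic
# expression into a prefix-notation token sequence that a sequence-to-sequence
# model could treat as a sentence. The token names here are illustrative,
# not Facebook's exact vocabulary.
import sympy as sp

def to_prefix_tokens(expr):
    """Pre-order walk of the expression tree, yielding one token per node."""
    if expr.is_Symbol or expr.is_Number:
        return [str(expr)]                  # leaves: variables and constants
    tokens = [type(expr).__name__]          # operator token, e.g. 'Add', 'Mul', 'Pow'
    for arg in expr.args:                   # then its operands, left to right
        tokens.extend(to_prefix_tokens(arg))
    return tokens

x = sp.Symbol('x')
expr = sp.cos(x**2) * sp.exp(x)             # cos(x**2) * exp(x)
print(to_prefix_tokens(expr))
# e.g. ['Mul', 'cos', 'Pow', 'x', '2', 'exp', 'x']  (argument order may vary)
```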

"By training a model to detect patterns in symbolic equations, we believed that a neural network could piece together the clues that led to their solutions, roughly similar to a human's intuition-based approach to complex problems," the research team wrote in a blog post released today. "So we began exploring symbolic reasoning as an NMT problem, in which a model could predict possible solutions based on examples of problems and their matching solutions."


Essentially, the research team taught the AI to unpack mathematical equations in much the same way that we unpack complex sentences. Instead of breaking out the verbs, nouns and adjectives, though, the system silos the equation's individual variables and operators.
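If you want to see that "parts of speech" analogy made concrete, here's a small sketch, again assuming sympy, that walks an equation's expression tree and tags each node as an operator, function, variable or constant. The labels are mine, for illustration only; they aren't taken from Facebook's system.

```python
# Walk an expression tree and tag each node with a rough "part of speech":
# operator, function, variable or constant. The categories are illustrative
# labels, not anything from Facebook's system.
import sympy as sp

def tag_nodes(expr, depth=0):
    if expr.is_Symbol:
        kind = "variable"
    elif expr.is_Number:
        kind = "constant"
    elif expr.is_Function:
        kind = "function"
    else:
        kind = "operator"
    name = expr.func.__name__ if expr.args else expr
    print("  " * depth + f"{kind}: {name}")
    for arg in expr.args:
        tag_nodes(arg, depth + 1)

b, a, c = sp.symbols('b a c')
tag_nodes(sp.Eq(b - 4*a*c, 7))   # the b - 4ac = 7 example from above
```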

The researchers focused primarily on integration problems and differential equations, but, because problems in those two flavors of math don't always have closed-form solutions, the team had to get tricky when generating training data for the machine learning system.

"For our symbolic integration equations, for example, we flipped the translation approach around: Instead of generating problems and finding their solutions, we generated solutions and found their problem (their derivative), which is a much easier task," the team wrote and which I vaguely understand. "This approach of generating problems from their solutions — what engineers sometimes refer to as trapdoor problems — made it feasible to create millions of integration examples."

Still, it apparently worked. The team achieved a success rate of 99.7 percent on integration problems, and 94 percent and 81.2 percent, respectively, on first- and second-order differential equations. Mathematica, by comparison, managed 84 percent on the same integration problems and 77.2 percent and 61.6 percent on the differential equations. FB's program also took just over half a second to arrive at its answer, versus the several minutes existing systems needed to do the same.