Wolfram Alpha, Mind Explosion
During SXSW this year I had the great fortune to see the keynote given by Stephen Wolfram. If you’ve not heard of him before, he’s the guy who created Mathematica, and more recently Wolfram Alpha, an online cloud brain. He’s an insanely smart guy with the huge ambition to change how we think.
When Stephen started, back in the early 1980s, he was interested in physics but wasn’t very good at integral calculus. Being an awesome nerd he wrote a program to do the integration for him. This eventually became Mathematica. He has felt for decades that with better tools we can think better, and think better thoughts. He didn’t write Mathematica because he loves math. He wrote it to get beyond math. To let the human specify the goals and have the computer figure out how to do it.
After a decade of building and selling Mathematica he spent the next decade doing science again. Among other things this resulted in his massive tome, A New Kind of Science, and the creation of Wolfram Alpha, a program that systematizes knowledge to let you ask questions about anything.
In 1983 he invented/discovered a one-dimensional cellular automaton called Rule 30 (he still has its code printed on his business cards). Rule 30 creates lots of complexity from a very simple rule. Even a tiny program can end up producing interesting complexity from very little. He feels there is no distinction between emergent complexity and brain-like intelligence. That is, we don't need a brain-like AI, the typical Strong AI claim. Rather, with emergent complexity we can augment human cognition to answer ever more difficult questions.
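For the curious, here is roughly what playing with Rule 30 looks like in the Wolfram Language (a minimal sketch, assuming a recent Mathematica or Wolfram Cloud session):

    (* Evolve Rule 30 for 100 steps from a single black cell on a white background *)
    rows = CellularAutomaton[30, {{1}, 0}, 100];

    (* Plot the evolution; an intricate, seemingly random triangle grows out of a trivial rule *)
    ArrayPlot[rows]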
The end result of all of this is the Wolfram Language, which they are just starting to release in SDK form. By combining this language with the tools in Mathematica and the power of a data-collecting cloud, they have created something qualitatively different. Essentially a super-brain in the cloud.
The Wolfram Language is a 'knowledge-based language', as he calls it. Most programming languages stay close to the operation of the machine and push most features out into libraries or other programs. The Wolfram Language takes the opposite approach. It has as much as possible built in; the language itself does as much of the work as possible and automates as much as it can for the programmer.
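To give a rough sense of 'built in' (these are my own toy examples, not ones from the keynote):

    (* Symbolic math, curated data, and units all live in the core language *)
    Integrate[Sin[x]^2, x]                            (* symbolic integration *)
    CountryData["France", "Population"]               (* curated country data *)
    UnitConvert[Quantity[1, "Miles"], "Kilometers"]   (* unit conversion *)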
After explaining the philosophy Stephen did a few demos. He was using the Wolfram tool, which is a desktop app that constantly communicates with the cloud servers. In a few keystrokes he created 60k random numbers, then applied a bunch of statistical functions like mean, numerical value, and skewness. Essentially Mathematica. Then he drew his live Facebook friend network as a nicely laid out node graph. Next he captured a live camera image from his laptop, partitioned it into blocks of size 50, applied some filters, compressed the result into a single final image, and tweeted it. He did all of this through the interactive tool with just a few commands. It really is a union of the textual, the visual, and the network.
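Working from memory, the demos looked roughly like this (the exact functions and filters are my guesses, not a transcript of his session):

    (* 60,000 random numbers and a few basic statistics *)
    data = RandomReal[1, 60000];
    {Mean[data], N[Mean[data]], Skewness[data]}

    (* Live Facebook friend network as a laid-out graph (requires authorizing the service) *)
    SocialMediaData["Facebook", "FriendNetwork"]

    (* Grab a camera frame, split it into 50x50 blocks, filter them, and reassemble *)
    img = CurrentImage[];
    blocks = ImagePartition[img, 50];
    final = ImageAssemble[Map[Blur, blocks, {2}]];

    (* Post the result to Twitter (again, needs an authorized service connection) *)
    SendMessage["Twitter", final]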
For his next trick, Mr. Wolfram asked the cloud for a time series of air temperatures in Austin for the past year, then drew it as a graph. Again, he used only a few commands and all data was pulled from the Wolfram Cloud brain. Next he asked for the countries that border Ukraine, calculated the lengths of the borders, and made a chart. Then he asked the system for a list of all former Soviet republics, grabbed the flag image for each, and used a 'nearest' function to see which flag is closest to the French flag. This 'nearest' function is interesting because it isn't a single fixed algorithm; the computer automatically selects the best algorithm from an exhaustive collection. It seems almost magical. He did a similar demo using images of handwritten numbers and the 'classify' function to create a machine learning classifier for new hand-drawn numbers.
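Again from memory, and with the dates, group names, and training data guessed, those queries might look something like this:

    (* Daily air temperature in Austin over the past year, plotted against time *)
    temps = WeatherData["Austin", "Temperature", {{2013, 3, 1}, {2014, 3, 1}, "Day"}];
    DateListPlot[temps]

    (* Countries bordering Ukraine *)
    neighbors = CountryData["Ukraine", "BorderingCountries"]

    (* Flags of the former Soviet republics, and the one nearest to the French flag *)
    republics = CountryData["FormerSovietRepublics"];   (* group name guessed *)
    flags = CountryData[#, "Flag"] & /@ republics;
    Nearest[flags, CountryData["France", "Flag"]]

    (* Train a classifier on labeled images of handwritten digits (placeholder data) *)
    c = Classify[<|0 -> zeroImages, 1 -> oneImages|>];
    c[newDrawing]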
He's right. The Wolfram Language really does have everything built in. The cloud has factual data for almost everything. The contents of Wikipedia, many other public databases, and Wolfram's own scientific databases are built in. The natural language parser makes it all easier to work with. It knows that NYC probably means New York City, and can ask the human for clarification if needed. His overall goal is maximum automation. You define what you want the language to do and then it's up to the language to figure out how to do it. It has taken 25 years to make this language possible, and to make it easy to learn and to guess at. He claims they've invented new algorithms that are only possible because of this system.
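That kind of interpretation is available to programmers too; a minimal sketch (my own example, not his):

    (* Interpret a free-form string as a city entity; "NYC" resolves to New York City *)
    Interpreter["City"]["NYC"]

    (* Or hand a whole natural language phrase over to the interpreter *)
    SemanticInterpretation["population of nyc"]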
Since all of the Wolfram Language is backed by the cloud they can do some interesting things. You can write a function and then publish it to their cloud service. The function becomes a JSON or XML web service, instantly live, with a unique URL. All data conversion and hosting is transparently handled for you. All symbolic computation is backed by their cloud. You can also publish a function as a web form; function parameters become form input elements. As an example he created a simple function which takes the names of two cities and returns a map containing them. Published as a form, it shows the user two text fields asking for the city names. Type in two cities, press enter, and an image of a map comes back. These aren't just plain text fields, though. They are backed by the full natural language understanding of the cloud. You get auto-completion and validation automatically. And it works perfectly on mobile devices.
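The two-city map example probably looked something like this (a sketch from memory; the parameter names and output format are my assumptions):

    (* Deploy the function as a web API: two city names in, a map image out *)
    api = CloudDeploy[
      APIFunction[{"city1" -> "City", "city2" -> "City"},
        GeoGraphics[{GeoMarker[#city1], GeoMarker[#city2]}] &, "PNG"]]

    (* The same function deployed as a web form with two smart, validated text fields *)
    form = CloudDeploy[
      FormFunction[{"city1" -> "City", "city2" -> "City"},
        GeoGraphics[{GeoMarker[#city1], GeoMarker[#city2]}] &, "PNG"]]

In both cases CloudDeploy hands back a cloud object with its own URL, which is the 'instantly live' part.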
Everything I saw was sort of mind-blowing if we consider what this system will do after a few more iterations. The challenge, at least in my mind, is how to sell it. It's tricky to sell a general purpose super-brain. Telling people "It can do anything" doesn't usually drive sales. They seem to be aware of this, however, as they now have a bunch of products specific to different industry verticals like physical sciences and healthcare. They don't sell the super-brain itself, but specific tools backed by the brain. They also announced an SDK that will let developers write web and mobile apps that use the NLP parser and cloud brain as services. They want it to be as easy to put into an app as Google Maps. What will developers make with the SDK? They don't know yet, but it sure will be exciting.
The upshot of all this? The future looks bright. It’s also inspired me to write a new version of my Amino Shell with improved features. Stay tuned.