Human Speech for Computers
After working with code a bit, I’m really looking forward to computers that can interpret language. Not artificial intelligence, per se. You see, I spent about an hour last night debugging a simple game I made. One sticking point was that a UI button turned inactive when you beat the game, but remained active if you cheated and flipped the ‘you win!’ variable.
I needed to add a statement…GUI.enabled = true, or something like that. A computer could have found it instantly, if I could just communicate with it naturally. “Why is this button turning grey?” “The GUI is set to disabled.” “What’s different about this other button, the one that’s still active?” “The script doesn’t run the section of code that enables the GUI.” “What do I need to change?” “Put this line of code here.” It’s not particularly difficult language, which is why I’m hoping we can achieve it in the next decade or so.
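The bug itself is a common pattern: the same flag gets set by two code paths, and only one of them runs the UI side effect. Here’s a minimal Python sketch of that pattern — all the names are hypothetical, and the original game was presumably in something Unity-like, not Python:

```python
class Button:
    """A stand-in for a UI button with an enabled flag."""
    def __init__(self):
        self.enabled = True


class Game:
    def __init__(self):
        self.you_win = False
        self.button = Button()

    def beat_the_game(self):
        # The legitimate win path updates the flag AND the UI.
        self.you_win = True
        self.button.enabled = False  # button greys out

    def cheat(self):
        # Flipping the variable directly skips the UI update,
        # so the button stays active — the two paths disagree.
        self.you_win = True
```

A human debugging this has to trace both paths by hand; a computer that understood “why is this button turning grey?” could diff the two paths and point at the missing line immediately.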
The ultimate goal is to have a computer that can work as fast as I can think. Our primary limitation is just translating human thought to machine thought. If every idea could be executed perfectly…well, things would begin to be judged by the quality of their concepts, rather than the quality of their execution.
I was reminded of this when making an event in Google Calendar. I typed “Catch the 6:04 train.” Google changed it to “Catch the train” and set the start time to 6:04. In a very limited context – making calendar events – computers can understand human language and structure. We just need more of that.
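In a domain that narrow, even very simple tools get you surprisingly far. Here’s a toy sketch of the calendar trick — this is just a regular expression, nothing like whatever Google actually does, and the function name is made up:

```python
import re

def parse_event(title):
    """Pull a start time like '6:04' out of an event title.

    Finds the first HH:MM pattern, strips it from the title,
    and returns (cleaned_title, (hour, minute)) — or (title, None)
    if no time is present.
    """
    match = re.search(r"\b(\d{1,2}):(\d{2})\b", title)
    if not match:
        return title, None
    hour, minute = int(match.group(1)), int(match.group(2))
    # Remove the time from the title and collapse leftover whitespace.
    rest = title[:match.start()] + title[match.end():]
    cleaned = re.sub(r"\s+", " ", rest).strip()
    return cleaned, (hour, minute)

print(parse_event("Catch the 6:04 train"))
# → ('Catch the train', (6, 4))
```

The point isn’t that regexes are the answer — it’s that when the context is constrained enough, “understanding” a sentence reduces to a tractable parsing problem. The open question is how far that context can be widened.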