Friday, September 16, 2011

Predictive text can only end in tears.

Speech is an incredibly complex thing. Some people even say it is what makes us human.

It’s an oddly comforting thought in this fast-moving world.

But what this means is that simulating speech is incredibly difficult. Since Alan Turing's work in the 1950s, scientists have spent decades trying to make computers engage in intelligent conversation, and they have largely failed.

Generally, these programmes work by recognising “cue” words or phrases and responding in kind, so the word “food”, for example, might trigger the reply “are you hungry?”.

The computer is effectively analysing the probability of certain words appearing in conversation and acting on this knowledge.
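For the curious, here is a minimal sketch of that cue-word approach; the cue words and canned replies are invented examples, not taken from Siine or from any real chatbot.

```python
# A minimal sketch of cue-word matching: scan the message for a known
# cue and fire back a canned reply. The cues and replies are invented
# examples, not taken from Siine or from any real chatbot.

CUE_REPLIES = {
    "food": "Are you hungry?",
    "weather": "Lovely day, isn't it?",
    "robot": "Are you calling me a unicorn?",
}

def respond(message: str) -> str:
    """Return a canned reply if the message contains a known cue word."""
    words = message.lower().split()
    for cue, reply in CUE_REPLIES.items():
        if cue in words:
            return reply
    return "Tell me more."

print(respond("I could murder some food right now"))  # -> Are you hungry?
print(respond("Nice to meet you"))                    # -> Tell me more.
```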

It may sound complex, but this type of technology is found in many things we use every day, including the predictive text on our mobile phones.
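And here is an equally toy sketch of the word-probability idea, assuming nothing cleverer than counting which word most often follows the one just typed, roughly the trick behind a very crude phone keyboard; the tiny corpus and the suggestions it produces are invented.

```python
# A toy word-probability predictor: count which word most often follows
# the one just typed, then suggest it. The corpus is invented purely
# for illustration.

from collections import Counter, defaultdict

corpus = "i am hungry . i am tired . i am not a robot . are you a unicorn ?"

# Count how often each word follows each other word (bigram counts).
follows = defaultdict(Counter)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def suggest(word: str) -> str:
    """Suggest the word that most frequently followed `word` in the corpus."""
    candidates = follows.get(word.lower())
    return candidates.most_common(1)[0][0] if candidates else "?"

print(suggest("i"))   # -> "am"
print(suggest("am"))  # -> whichever follower is most frequent, e.g. "hungry"
```

Real predictive systems are far more elaborate than this, but the basic move is the same: offer the statistically likely word, not necessarily the word you actually meant.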

At Siine our approach is different – we put these conversational tools in the user’s hands, meaning you don’t have to rely on computer algorithms to say what you want.

What are the advantages of that, you might wonder?

Well, you may have seen what happened recently when two chatbots were left to have a conversation with each other.

It ended up in an argument.

And not just any old argument: a vicious, pointless one. (Points of contention, incidentally, included whether each participant was a robot or a unicorn. The two chatbots also got swiftly onto the subject of God, which I think we all know is a dinner-party no-no.)

So what, you might think? Humans argue all the time.

True – but this argument didn’t even make much sense.

“What is God to you?” one chatbot asks.

“Not everything,” the other responds, oddly. Technically this fulfils the criteria for an answer, but it simply doesn’t work here.

And it made us think: if two highly advanced chatbots can’t have a civilised conversation, why would you trust the predictive text on your phone to say what you want?

It doesn’t understand you and it doesn’t understand what you want to say.

We would argue, of course, that you shouldn’t blindly trust your predictive text, and with Siine Writer you don’t have to.

After all, as this video proves, if we leave predictive machines to do the talking for us, things will only end in tears.



1 comment:

  1. I've watched this clip about five times now, trying to work out which of the chatbots is the angriest. I think it's the female one, although the male has some real passive aggression going on.
