If one tried to teach a machine what a person was, the difference between the definitions "fucking oaf" (foaf) and "fantastic outstanding amazing friend" (foaf) would not be understood by the machine in the negative or positive emotional way that a person would understand it.
A bit like swearing at a pet dog or cat or budgie or rabbit in a nice happy voice whilst sitting quietly on the couch next to it and stroking its head gently.
It is the tone of the voice that is reacted to, along with the body language and the actions, not the actual words.
Therefore, I would imagine that teaching a machine to recognize the emotions in a person's voice and face, and a person's body language and actions, would be far more effective than trying to teach it with written dictionary words alone.
Opening up the extra hearing porthole to its data processing ability, together with the seeing porthole (matching emotions in voices and faces with the actions of persons on YouTube or video etc., and then putting it all together with words and photographs on Facebook and Badoo and other websites), may build a semi-human kind of "Thank fuck I-Am a robot and not a fucking oaf!"
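The idea above, that tone and body language should outweigh the bare words, can be sketched as a toy weighted fusion of the different "portholes". Everything here is made up for illustration: the scores, the weights, and the function name are all assumptions, not any real emotion-recognition system.

```python
# Toy sketch of fusing the "portholes": words, voice, and body language
# each give a score in [-1, 1] (positive = friendly, negative = hostile).
# The weights are invented for illustration; voice and body dominate,
# matching the intuition that tone matters more than the actual words.

def fuse_emotion(word_score, voice_score, body_score,
                 weights=(0.2, 0.5, 0.3)):
    """Weighted average of per-modality emotion scores."""
    w_words, w_voice, w_body = weights
    return (w_words * word_score
            + w_voice * voice_score
            + w_body * body_score)

# "Fucking oaf" said in a nice happy voice while gently stroking the dog:
friendly = fuse_emotion(word_score=-0.9, voice_score=0.8, body_score=0.7)

# The same words shouted with an angry posture:
hostile = fuse_emotion(word_score=-0.9, voice_score=-0.8, body_score=-0.7)

print(round(friendly, 2))  # 0.43  -> positive overall, despite harsh words
print(round(hostile, 2))   # -0.79 -> negative overall
```

The same harsh words flip from friendly to hostile purely on the strength of the voice and body scores, which is the point of the pet-on-the-couch example.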
I wonder what one would do to teach a machine to taste, smell, and feel through its own skin as well?
So many new nanomaterials that can think for a person's body are being discovered lately, and they would probably be the kind of thing one could use to build up the feeling flesh of a machine that can feel cold or heat and monitor its oil levels and its electrical and magnetic wiring and firing, just as we would monitor our BMI and heart beat and breathing rates to determine our health; when not optimal or run down, we would go get some maintenance done by our local GP.
To get it to feel pain or pleasure may be difficult, I would imagine, but probably not impossible once the brain of a rat or bee or similar is sequenced, like the genetic code was a long time ago, and then somehow that information is downloaded into the machine's emotional bank somewhere in its internal wiring.
Maybe there are very clever people out there who already know how to do this in the medical fields of neurology and neuropsychology and physiology, and in the engineering and robotics worlds of games and PlayStations, I, Robot, Transformers, and such, on movie sets where incredible supermen with jet boosters on their backs fly through the sky as fast as jet planes.
(Jet packs on the backs of supermen recently became a reality at an air show in New Zealand.)
It is so awesome to see the future unfold, with technology that can make or break it in negative or positive ways.
Such an awesome responsibility for this generation today.
Much greater than my generation's responsibility, when we had to try and grow kids with brains without even one bit of help from Google on the www.
I wonder how long it will take to research every neuron in just one brain and follow its dendrite tree branch by branch and twig by twig as it flickers and flits and flashes and lights up or dies down and withers away as it thinks and speaks and does and feels and hears and tastes and smells and sees.
^^(I imagine researchers in a lab somewhere implanting a data-gathering chip in the brains of rats from birth to death that monitors every implication and connotation and then feeds that information about mind and behavior into a machine, which will then collate it and analyze it and send it off for processing to its various compartments for action in nanosecond computations.)
All the emotions of pain and fear and hope and love and hate and the many shades of patience and kindness and niceness and planning and learning and computing and playing will be loaded into the machine eventually from animal and human brains.
Imagine the data input of information being gathered as we speak from normal people like me into the emotional bank collection of the machine to enable it to hear like me, to see like me, to feel emotion like me.
All this information is being gathered as we speak from the www to add to its hearing collection, its sight collection, its feeling collection, its dictionary, its YouTube behavior, its telephonic tones of voice, its wordy discussions on social media, its satellite Google Maps, its languages, and so on and so forth, by people who love their smart technology and Facebook and other mass media connections.
Minus the porthole of taste and smell, the robot will be quite realistic but without life experience of its own.
It will have to live through all of us.
After all of that data input, I wonder if a robot would then be quite similar to a human, except with a lot more memory and the computational ability to access all those data files of information in quantum nanoseconds as opposed to "Um… ah… um?"
##(This bit of prose was inspired by reading up about the www consortium (W3C), semantics, the Web Ontology Language (OWL), and SPARQL, and because the father of the www, as he is called, is in New Zealand to shake hands with our Prime Minister.)
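As it happens, "foaf" is also the name of a real semantic-web vocabulary (FOAF, "Friend of a Friend") that is queried with SPARQL, which the note above mentions. As a rough illustration of the idea, here is a miniature triple store in plain Python that mimics a SPARQL basic graph pattern; the names and data are invented, and a real system would use an actual SPARQL engine rather than this sketch.

```python
# A toy triple store mimicking the SPARQL-style pattern
#   { ?person foaf:name ?name . ?person foaf:knows ?friend }
# Triples are (subject, predicate, object); the data is made up.

triples = [
    ("alice", "foaf:name", "Alice"),
    ("alice", "foaf:knows", "bob"),
    ("bob", "foaf:name", "Bob"),
]

def match(subject=None, predicate=None, obj=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Whom does alice know? (the ?friend variable in the pattern above)
friends = [o for _, _, o in match("alice", "foaf:knows")]
print(friends)  # ['bob']
```

The wildcard `None` plays the role of a SPARQL variable: fixing the subject and predicate and leaving the object open is how a query engine binds `?friend`.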