When do you see a robot gauging human interests without interacting with a human?
It is already happening, but only with emotions that have outward expression, not with empathy or the invisible processes happening inside your body. So robots today are able to read any human emotion that shows on the face or in body language. They are capable of doing it. There are mobile robots today which are able to capture these cues and respond accordingly, but they are very costly; they have not gone mainstream at this point in time. So slowly you will see this happen. Sophia, for example, was demonstrated, and we are in touch with many companies who are working on similar technologies. It is happening, and you will find them mainstream. Reading emotions? Yes. Empathy? Maybe not, because it will take a long time even for scientists to decipher how many levels of neural networks we have within our own brain, and that research is still happening.
So understanding the depths of human emotion and then replicating them in robots is an ongoing task. And I'm sure in the coming years you will find more and more robots which are able to strike up independent conversations and modify those conversations based on the emotions they are reading on the face, or the tone and tonality, of the human being on the other side.