Teaching computers to learn on their own has long been the core aim of AI research, with the world’s largest tech companies, including Facebook, Google and Baidu, all racing to develop the best techniques. Although there have been breakthroughs in speech and image recognition, machines still struggle with basic physical tasks.
Kindred AI, a Canadian start-up, decided to tackle these issues by applying a new approach based on the technique of immersive teleoperation. According to its founders, the best way to make robots as smart as humans is to put them in humans’ shoes and teach them to learn the way people do.
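The article does not spell out Kindred’s algorithms, but teleoperation-driven learning is commonly implemented as behavioural cloning: a human pilot drives the robot, each step is logged as an (observation, action) pair, and a policy is then trained to imitate the pilot. A minimal sketch under that assumption, with every name (`record_demo`, `make_clone`, the toy pilot) hypothetical and a simple nearest-neighbour lookup standing in for a learned model:

```python
import math

def record_demo(pilot_policy, observations):
    """Log (observation, action) pairs produced by a human pilot during teleoperation."""
    return [(obs, pilot_policy(obs)) for obs in observations]

def make_clone(demo_log):
    """Build a policy that imitates the pilot: for a new observation,
    copy the action taken at the most similar logged observation."""
    def policy(obs):
        nearest_obs, nearest_action = min(
            demo_log, key=lambda pair: math.dist(obs, pair[0])
        )
        return nearest_action
    return policy

# Toy example: a pilot that always steers toward the origin.
pilot = lambda obs: (-obs[0], -obs[1])

log = record_demo(pilot, [(1.0, 0.5), (2.0, 2.0), (-1.0, -1.0)])
clone = make_clone(log)

# Near the logged observation (1.0, 0.5), the clone repeats the pilot's action.
print(clone((0.9, 0.4)))
```

A real system would replace the lookup with a trained function approximator, but the data flow is the same: human demonstrations in, imitating policy out.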
Unlike some of the most cash-flush corporations in Silicon Valley, Kindred is focusing not on chatbots or game-playing programs, but on automating physical robots. The company envisions a future in which intelligent machines work together with people to increase the efficiency of both. Kindred’s technology, if successful, could lead to a new type of AI and general-purpose robots that are capable of multiple tasks.
Even though it is too early to say whether Kindred will succeed in creating real machine intelligence, the company’s unique approach and all-star team have already convinced some big investors.
Kindred has been focusing on bringing its work into the real world through partnerships with existing industrial robotics companies, though it declined to discuss specific agreements. The company has also started pilot programs with a number of customers that it has yet to disclose. In 2017, the US retailer Gap signed up with Kindred to test its first commercial product in the chain’s warehouses.
Despite obvious concerns that such technologies could displace warehouse workers from their jobs, Kindred says its plan right now is not to eliminate jobs but to shift workers away from tedious, repetitive work toward more challenging and rewarding tasks, making their work easier in the process.
However, it is unclear whether this blending of human and machine intelligence will actually work, or whether it will simply end up creating something very confused.