AI Robotics

AI's effect on robotics and development

Robotic sensory systems must reach and surpass human levels in order to create artificial senses that can enhance AI's self-learning and self-correction. This, in turn, is necessary to enable the experimental automation that can shorten development cycles.

This blog entry is the third in a series about the disruptive impact of a technology based on a new form of intelligence that is self-learning and universally enabling, and that allows for deep customization:

  • Where does AI impact today?
  • What is new and disruptive about AI?
  • AI’s effect on robotics and experimental automation
  • AI’s effect on climate and meteorological research
  • AI's impact on social sciences
  • AI’s effect on teaching, learning, and psychotherapy
  • AI’s impact on biotechnology and pharmaceuticals
  • AI’s effect on materials research and quantum chemistry
  • AI’s effect on theoretical physics and mathematics
  • What should investment strategies in AI take into account?

The Area That Needs Development for AGI to Emerge

In order to develop AGI, artificial senses must first be created. They are the prerequisite for AI's self-correction and thus for increased self-learning. But creating such senses requires, above all, the development of sensory systems at human level or better. This will also accelerate the development of robotics.

This is currently the greatest bottleneck on the path toward AGI. Big Tech knows this, and its interest in investing in the field is therefore rising significantly. This creates the basis for major value uplifts in a sector that is otherwise moderate in size.

Robotics Depends on Senses, Where Vision, for Example, Is Under Development ...

Among the senses still particularly underdeveloped are advanced vision, finer tactile sensitivity, and the sense of smell.

Hyperspectral vision: Humans perceive colors as different combinations of RGB. Hyperspectral imaging, however, can capture more than 100 distinct wavelength bands, reaching far beyond the human spectrum into UV, infrared, and more.

This enables rapid diagnostics of chemical compositions, biological changes, water content, etc. Today, the technology is applied at TRL 6–7 in materials laboratories and plant research. In the future, it may be especially useful in mineral exploration, food quality control, and medical diagnostics. In itself, hyperspectral vision thus opens new opportunities and increases efficiency.
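To make the idea concrete, here is a minimal Python sketch of computing a normalized-difference index (a common proxy for water content) from a hyperspectral cube. The array layout and the band positions are illustrative assumptions, not tied to any specific instrument.

```python
# Minimal sketch: estimating water content from a hyperspectral cube.
# ASSUMPTIONS: a NumPy array of shape (height, width, bands) and
# illustrative band indices; real instruments and band positions differ.
import numpy as np

def water_index(cube: np.ndarray, nir_band: int = 60, swir_band: int = 90) -> np.ndarray:
    """Normalized-difference index between two bands, a proxy for water content."""
    nir = cube[:, :, nir_band].astype(float)
    swir = cube[:, :, swir_band].astype(float)
    return (nir - swir) / (nir + swir + 1e-9)  # epsilon avoids division by zero

# Example: a synthetic 128x128 scene with 120 wavelength bands
cube = np.random.rand(128, 128, 120)
print("mean index:", water_index(cube).mean())
```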

… and tactile sensing, as well as …

Tactile sensors. Human touch is mainly limited by the coarseness of nerve endings and the structure of the skin. New types of e-skin use microsensors embedded in flexible graphene thin films. These can register texture, elasticity, temperature, and microscopic vibrations, making it possible, for example, to handle fragile biological samples or to perform precise surgical simulations. Tactile sensors are particularly important for automated laboratories.

MIT has likewise developed the GelSight sensor, whose internal camera optics look through a transparent silicone surface. This enables the robot to “see” touch at the micrometer level.

In 2023, the University of Glasgow launched a graphene-based e-skin capable of “healing” itself from micro-damage.
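As a toy illustration of what the micro-vibration channel of such e-skins enables, the sketch below flags incipient slip from a stream of pressure frames by looking for high-frequency vibration energy. The frame layout, sampling rate, and threshold are all assumptions made for the example.

```python
# Minimal sketch: detecting incipient slip from a tactile sensor array.
# ASSUMPTIONS: pressure frames of shape (time, rows, cols), a 1 kHz
# sampling rate, and an illustrative energy threshold.
import numpy as np

def detect_slip(frames: np.ndarray, fs: float = 1000.0, threshold: float = 0.05) -> bool:
    """Flag slip when high-frequency vibration energy exceeds a set share."""
    signal = frames.mean(axis=(1, 2))                 # average pressure per frame
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    hf_energy = spectrum[freqs > 100.0].sum()         # energy above 100 Hz
    return bool(hf_energy > threshold * spectrum.sum())

frames = np.random.rand(512, 16, 16) * 0.01           # synthetic quiet contact
print("slip detected:", detect_slip(frames))
```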

… the sense of smell …

Chemical “noses” and “ears” can detect gases, particles, dissolved molecules, or sound signatures from chemical reactions. They are especially useful for process monitoring in medical diagnostics and food control. Applications include safety monitoring in cases of environmental hazards, as well as process control in fermentation, for example in enzyme production.

In addition, major development is underway in acoustic chemical sensing using ultrasound or acoustic waves, which can measure density and viscosity and may eventually extend into the infrared spectrum. Potential applications include process monitoring in the chemical and pharmaceutical industries, as well as medical diagnostics.
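Here is a minimal sketch of how such an electronic “nose” might classify analytes from a sensor array, using synthetic data and a k-nearest-neighbour classifier; the channel count, analytes, and response values are hypothetical.

```python
# Minimal sketch: classifying gas signatures from a hypothetical 8-channel
# e-nose. Training data is synthetic; a real deployment would use
# calibrated sensor responses for known analytes.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
ethanol = rng.normal(loc=0.8, scale=0.1, size=(50, 8))   # synthetic analyte A
ammonia = rng.normal(loc=0.3, scale=0.1, size=(50, 8))   # synthetic analyte B
X = np.vstack([ethanol, ammonia])
y = ["ethanol"] * 50 + ["ammonia"] * 50

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
sample = rng.normal(loc=0.75, scale=0.1, size=(1, 8))    # unknown reading
print("predicted analyte:", clf.predict(sample)[0])
```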

… and finally, software for real-time coordination

In the future, the most important “enabler” will likely be software that can integrate sensory inputs in real time, as the human brain does. This multimodality requires improved pattern recognition, for which AI offers special opportunities (vector relations, etc.).
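To illustrate what such multimodal integration can look like in code, here is a minimal late-fusion sketch in which each modality’s feature vector is projected into a shared embedding space and averaged. The encoders are stubbed with random projections; a real system would use trained vision, tactile, and audio models.

```python
# Minimal sketch: late fusion of per-modality features into one embedding.
# ASSUMPTION: random projections stand in for trained encoders.
import numpy as np

rng = np.random.default_rng(1)
DIM = 64                                    # shared embedding dimension
projections = {
    "vision":  rng.normal(size=(2048, DIM)),
    "tactile": rng.normal(size=(256, DIM)),
    "audio":   rng.normal(size=(512, DIM)),
}

def fuse(features: dict) -> np.ndarray:
    """Project each modality into the shared space and average."""
    embedded = [features[m] @ projections[m] for m in features]
    fused = np.mean(embedded, axis=0)
    return fused / np.linalg.norm(fused)    # unit-length joint embedding

inputs = {m: rng.normal(size=p.shape[0]) for m, p in projections.items()}
print("fused embedding shape:", fuse(inputs).shape)
```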

The next steps will therefore focus particularly on improving robots’ proprioception: their ability to perceive their own position, movement, and exertion in space; in other words, sensory coordination. Carnegie Mellon University, for example, has developed tactile systems where robots learn gripping techniques by simulating thousands of variations before testing them physically (as humans do).
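A toy version of this simulate-first approach might look as follows; the grasp-stability function here is a stand-in for a physics engine and is purely illustrative.

```python
# Minimal sketch: score thousands of simulated grasp candidates, then
# test only the best one physically. ASSUMPTION: the toy stability
# function below replaces a real physics simulator.
import numpy as np

rng = np.random.default_rng(2)

def simulate_grasp(width: float, force: float) -> float:
    """Toy stability score, highest near width 0.04 m and force 5 N."""
    return -(width - 0.04) ** 2 - 0.001 * (force - 5.0) ** 2

candidates = [(rng.uniform(0.01, 0.10), rng.uniform(1.0, 20.0))
              for _ in range(5000)]
best = max(candidates, key=lambda c: simulate_grasp(*c))
print(f"best grasp: width={best[0]:.3f} m, force={best[1]:.1f} N")
# Only this winning candidate would be executed on the physical robot.
```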

Big Tech Is Very Active, Both as Developer and as Investor

Most Big Tech companies are active in this area. Amazon, for instance, is focusing on further developing its Vulcan robot, whose tactile sensing in the “hands” can already handle 75% of warehouse objects, thereby reducing human strain. In addition, Atlas robots are now being commercially deployed in Hyundai factories. Finally, both MIT and Google DeepMind are focusing on visuomotor AI for cameras and robots, as well as on integrating relatively inexpensive sensors. This has significantly increased robots’ self-correction and self-learning.

Experimental Automation Can Multiply Development Speeds ...

Experimental automation is another highly significant area of development. Commercially, the primary goal is to reduce time-to-market and development costs.

Developing concepts and testing them up to TRL 3–4 is one thing: it depends mainly on a combination of creativity and engineering and can happen in a decentralized, spontaneous manner. But taking a product further toward commercial value usually requires a longer laboratory process in which procedures are continuously developed, tested, and refined. This is often the longest and most expensive part of the development cycle. AI-driven automation will therefore provide substantial commercial advantages: dramatically shorter development cycles, lower development costs, and more precise documentation and quality assurance.

… While Also Improving the Quality of Development

In the future, clinical tests can run 24/7 and with significantly greater precision (for example, in time-critical trials). Moreover, the iterative process is optimized because reinforcement learning and “closed-loop science” are applied more consistently. This improves the ability to review, structure, and benefit from past experimental data.
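A minimal sketch of such a closed loop: propose a condition, run the assay, keep whatever improves the result, and repeat. The yield function and the simple hill-climbing rule are illustrative stand-ins for a real automated assay and a real optimizer (for example, Bayesian optimization).

```python
# Minimal sketch of closed-loop experimentation. ASSUMPTIONS: the
# stand-in assay has an unknown optimum at 63 C, and hill climbing
# replaces a production-grade optimizer.
import numpy as np

rng = np.random.default_rng(3)

def run_experiment(temp_c: float) -> float:
    """Stand-in assay: noisy yield, peaked at 63 C."""
    return -(temp_c - 63.0) ** 2 + rng.normal(scale=2.0)

temp, best_yield = 40.0, run_experiment(40.0)
for _ in range(50):                        # the 24/7 loop, compressed
    candidate = temp + rng.normal(scale=3.0)
    result = run_experiment(candidate)
    if result > best_yield:                # keep the better condition
        temp, best_yield = candidate, result
print(f"converged near {temp:.1f} C")
```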

Reviewing and cross-testing historical experimental data also creates opportunities to discover new phenomena for further testing. For instance, the University of Liverpool has developed the robotic chemist Ada, which can autonomously discover new catalysts. DeepMind’s AlphaFold has also made it possible to predict protein structures, which robots have since tested automatically. Furthermore, researchers now program experiments remotely at laboratories such as Opentrons and Emerald Cloud Lab.

Finally, Carnegie Mellon University has gone a step further. Its Autonomous Chemistry Lab allows AI to formulate hypotheses, carry out experiments, and learn from the results independently. The university estimates that this has increased development speed by a factor of 10–20.

But It Can Also Turn the Development Process Upside Down

Altogether, advances in robotics and experimental automation primarily create opportunities for leaps in development speed and for significant reductions in development costs. 

But perhaps most importantly, hypotheses may eventually be generated directly from raw data and simulated digitally, without a human first needing to “understand” the entire process. This can not only accelerate development but also open the door to fundamentally new insights.

Together with multisensory AI integration, AI may also achieve situational awareness beyond that of humans. This could enhance safety in sensitive processes and enable deeply customized products that are currently out of reach, for example nanobots.
