What is Autonomous AI? A guide for businesses


Autonomous artificial intelligence is defined as routines designed to enable robots, cars, airplanes, and other devices to perform extended sequences of maneuvers without human guidance. The artificial intelligence (AI) revolution has reached a stage where current solutions can reliably perform many simple, coordinated tasks. Now the goal is to extend this capability by developing algorithms that can plan ahead and create a multi-step strategy to achieve more.

Strategic thinking requires a different approach than many successful well-known applications for AI. For example, machine vision or speech recognition algorithms focus on a specific point in time and have access to all the data they might need. Many machine learning applications work with training sets that cover all possible outcomes.

Autonomous operations often require imagining a range of possible future outcomes, anticipating potential problems, and then determining a course of action that minimizes the hazards while maximizing other factors such as speed or reliability. Learning to play chess is good training for these tasks, both for computers and humans.

The autonomous devices can already rely on a number of mature technologies that have been developed to help people. Sophisticated digital maps of roads already exist, as well as proven tools for finding the best route through them. Sonar sensors and cameras already warn of possible collisions.

Much of the work involved in creating autonomy goes into the strategic algorithms, but also into building better sensors and interpreting their results. Some companies are pushing for better cameras with active laser illumination to provide more accurate information about the world. Others try to use better mathematical models to squeeze more information out of standard sensors.

What are the important parts of autonomous AI?

The field is still very new and researchers are constantly refining their algorithms and their approach to the problem, but it is common to break the work down into these layers.

  • sensing — Creating a model of the ever-changing world requires a collection of sensors, usually cameras, and often controlled illumination from lasers or other sources. The sensors also usually contribute position information from GPS or some other independent mechanism.
  • fusion — The details from the various sensors must be organized into a single, coherent view of what is happening around the vehicle. Some images may be obscured. Some sensors can fail. They may not always be consistent. The sensor fusion algorithms need to sort through the details and create a reliable model that can be used for planning in later stages.
  • perception — After the model has been created, the system must start identifying important areas such as roads or paths or moving objects.
  • planning — To find the best way forward, you need to study the model and also import information from other sources such as map software, weather forecasts, traffic sensors, and more.
  • control — After a path has been chosen, each device must ensure that the motors and steering work to move along the path without being distracted by bumps or small obstacles.
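The layered flow above can be sketched as a simple pipeline. This is a minimal illustration only; all function names, data shapes, and the toy decision logic are invented for this example, not taken from any real autonomy stack:

```python
# Minimal sketch of the layered autonomy pipeline described above.
# All names and the toy logic are illustrative.

def sense(raw_inputs):
    """Sensing: collect readings from cameras, lidar, GPS, etc."""
    return [{"sensor": name, "reading": value} for name, value in raw_inputs.items()]

def fuse(readings):
    """Fusion: merge possibly inconsistent readings into one world model."""
    return {r["sensor"]: r["reading"] for r in readings}

def perceive(world_model):
    """Perception: flag the parts of the model that matter for driving."""
    return [sensor for sensor, reading in world_model.items() if reading == "obstacle"]

def plan(obstacles):
    """Planning: pick an action given the identified hazards."""
    return "brake" if obstacles else "continue"

def control(action):
    """Control: translate the plan into actuator commands."""
    if action == "brake":
        return {"throttle": 0.0, "brake": 1.0}
    return {"throttle": 0.5, "brake": 0.0}

# Information flows top-down through the layers as decisions are made:
commands = control(plan(perceive(fuse(sense({"camera": "obstacle", "radar": "clear"})))))
```

In a real system each stage is vastly more complex and the stages exchange feedback, but the top-down composition of the layers is the essential shape.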

In general, information flows from the sensing layer down to the control layer as decisions are made. However, there are feedback loops that bring information back up from the lower layers to improve fusion, perception, and planning.

The systems also incorporate data from external sources. A major benefit of autonomous systems is seen when devices communicate with each other and share information in a process sometimes referred to as “fleet learning.” The ability to fuse the sensor readings allows devices to make smarter decisions by using historical data from other devices that may have been in the same location earlier. Spotting moving objects like pedestrians can be difficult with just a few seconds of video as people may be standing still, but it becomes easier when it’s possible to compare the sensor data to similar images captured earlier in the day.
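The pedestrian example can be sketched as a simple lookup against fleet history. This is a hedged illustration of the idea only; the grid-cell positions, history format, and classification rule are assumptions made for this sketch:

```python
# Illustrative sketch of "fleet learning": use observations logged earlier
# by other vehicles at the same location to classify an object that looks
# static in the current vehicle's few seconds of video.

def classify_with_history(current_cell, fleet_history):
    """fleet_history maps a map grid cell to object types seen there earlier."""
    seen_before = fleet_history.get(current_cell, set())
    if "pedestrian" in seen_before:
        # Another vehicle recently saw a person here, so treat the shape as
        # a pedestrian even though it is not moving in our short clip.
        return "pedestrian"
    return "static object"

fleet_history = {(12, 7): {"pedestrian"}, (3, 4): {"traffic cone"}}
print(classify_with_history((12, 7), fleet_history))  # pedestrian
print(classify_with_history((5, 5), fleet_history))   # static object
```

Production systems would weigh many more signals, but the principle is the same: historical data from the fleet disambiguates what a single short observation cannot.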

What options are there to simplify the work?

Many autonomous systems can work quite well by simplifying the environment and limiting options. For example, autonomous commuter trains have been operating in amusement parks, airports, and other industrial settings for years. Their routes are predetermined, limited and often kept clear of moving obstacles. This simplifies each of the stages in the algorithm.

Many plans to create functioning autonomous systems depend on creating such a limited environment. Some speak, for example, of autonomous vehicles that operate only on industrial sites. Others focus on warehouses. Minimizing random obstacles is key.

Another possible solution is to invoke human override and minimize the time required for this. Some imagine that the cars might gently stop or freeze when the scene becomes too complex to interpret. Either the passenger or a remote person at a central mission control facility can take over until the problem is resolved.

What are the levels of autonomous AI vehicle guidance?

To ease the transition to fully autonomous AI command vehicles, some AI scientists are breaking down the transition from human to machine control into stages. This allows a legal framework to develop and lets people categorize their tools. The frameworks are not fixed: some divide the hierarchy into five levels and some into six, for example. The distinctions are not clear-cut, and some algorithms can show behavior of two or three levels at the same time.

The levels are the following:

  • level 0 — Humans make all the decisions, except maybe some automatic systems like windshield wipers or heating.
  • level 1 — Humans can start delegating the responsibility for braking or staying in lane to the car.
  • level 2 — The car will take on several important tasks such as braking, accelerating or lane following, but humans must be ready to take control at all times. Some systems may even require humans to keep their hands on the steering wheel.
  • level 3 — Humans can occasionally take their eyes off the road for short periods, but must be ready to respond to an alarm if needed. The car is capable of taking control on well-defined and mapped routes such as motorways, but not on roads or trails that have not been studied and mapped in advance.
  • level 4 — Humans can turn to other tasks, but may take control at any time. In some cases where the route is not well understood by the AI, the human may need to take over.
  • level 5 — The human can treat the service like a taxi and relinquish all control.
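The hierarchy above corresponds roughly to the SAE levels of driving automation and can be encoded as a simple lookup. The enum names below are our own shorthand for the descriptions in the list, not official terminology:

```python
# The six levels described above, encoded as a lookup table.
from enum import IntEnum

class AutonomyLevel(IntEnum):
    NO_AUTOMATION = 0      # humans make all the decisions
    DRIVER_ASSISTANCE = 1  # braking or lane keeping delegated to the car
    PARTIAL = 2            # several tasks automated; human ready at all times
    CONDITIONAL = 3        # hands-off on mapped routes; must answer alarms
    HIGH = 4               # human attention optional on well-understood routes
    FULL = 5               # the car is treated like a taxi; no human control

def human_attention_required(level):
    """Per the descriptions above, levels 0-3 still require an attentive human."""
    return level <= AutonomyLevel.CONDITIONAL
```

Such a table is mostly useful for making requirements explicit in code, e.g. gating features on whether `human_attention_required(level)` is true.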

The levels are not exact as the AI’s success may depend on the route. A given set of algorithms can provide near-complete autonomy on well-defined paths like following freeway lanes with little traffic, but can fail in unusual or undefined situations.

How do the giants tackle the challenge?

Cruise Automation is a General Motors subsidiary. It has built fully autonomous versions of Chevrolet's Bolt and uses them to sell rides in cities like San Francisco. It also operates the same cars in Phoenix to deliver goods for Walmart.

Apple hasn’t announced any public products, but there have been numerous reports that they are hiring engineers with expertise in the field. For example, one of the developers of Tesla’s Autopilot software switched to Apple.

Alphabet’s Waymo division is building a module called the Waymo Driver that can be installed on a conventional car and integrated with its control hardware. Their effort was one of the first to be seen on public roads, and the company boasts millions of miles of testing. They also run a ride-hailing service called Waymo One in Phoenix using the technology and are working with long-distance carriers to test the software for transporting goods on long trips.

Microsoft’s outreach is more general and experimental. For example, its research group shares the Moab codebase under the MIT license so anyone can experiment with the higher-order challenges of sensing, planning, and acting. This is part of a larger low-code tool called Bonsai, which can control any industrial process, not just drive a truck. Pepsi, for example, uses the technology to improve the quality of its Cheetos snacks.

Oracle also uses the word “autonomous” in the name of the latest version of its flagship database, which applies AI algorithms to optimize performance, saving staff time.

IBM applies its AI technology to piloting ships. Its AI Captain is built to avoid collisions while making intelligent decisions about wind, weather, and tides.

How are startups impacting autonomous AI?

Some startups build complete systems and create vertically integrated transportation systems. Pony.ai, for example, builds a sensor array that sits on top of existing car models and relays control instructions to guide them. It has created versions for a range of models from automakers such as Lexus, Hyundai, and Lincoln. It also operates a robotaxi service in Guangzhou and Beijing, and in Irvine and Fremont in California, sending autonomous cars to riders who hail them with a phone app.

Wayve focuses on bringing agile machine learning algorithms into a similar module. They emphasize a model where the car is constantly improving and adapting to the neighborhood while sharing information with others in the fleet. They routinely test cars on London streets and are exploring the creation of autonomous delivery fleets.

Argo is building a platform that brings together lidar-based sensor hardware, guidance software, and all the map information needed to operate fully autonomous vehicles. They have integrated their autonomous platform with Ford and Volkswagen cars. They also work with Walmart to develop local delivery vehicles.

Many of the startups are tackling parts of the challenge, from designing better sensors to developing better planning algorithms. AEye builds 4Sight, an adaptive sensor system based on lidar. It currently manufactures two products, known as the M and the A, which are optimized for industrial and automotive applications, respectively.


Source: https://venturebeat.com/2022/03/31/what-is-autonomous-ai/

Chris Barrese

