Building a lunar exploration rover model

The autonomous driving unit is designed to explore the moon’s surface and is equipped with advanced technologies, delivering a universally applicable mobility platform capable of handling various payloads.


Photo Courtesy: Hyundai Motor Group

Hyundai Motor Group started building an initial development model of a lunar exploration mobility rover with aerospace partners. Additionally, Hyundai officials signed a research agreement with six Korean research institutes in the aerospace sector to run and support a consultative body to develop a mobility solution for lunar surface exploration.

The consultative body includes Korea Astronomy and Space Science Institute (KASI), Electronics and Telecommunications Research Institute (ETRI), Korea Institute of Civil Engineering and Building Technology (KICT), Korea Aerospace Research Institute (KARI), Korea Atomic Energy Research Institute (KAERI), and Korea Automotive Technology Institute (KATECH).

Hyundai has set the development direction for the initial lunar exploration mobility model and expects to complete the initial test unit in the second half of 2024, aiming to create a launch-capable model in 2027.

“Hyundai Motor Group has consistently stated its goal is to contribute to expanding human reach and the scope of human mobility experiences,” says Yong Wha Kim, executive vice president and head of the R&D Planning & Coordination Center at Hyundai Motor and Kia. “The creation of the lunar exploration mobility development model not only reflects this goal, but also shows our ambition to achieve tangible results in the face of significant challenges. With the rover’s development, we are moving beyond land, sea, and air mobility to expand into space mobility.”

For the rover, conceived as part of a multi-purpose mobility platform, the group is using Hyundai Motor Co.’s and Kia Corp.’s advanced robotics and autonomous driving technologies, driving systems, and charging components, along with Hyundai Rotem’s robot manufacturing technology.

Hyundai’s components will occupy the lower section of the rover, while the upper section will consist of scientific payloads for lunar surface exploration. The rover will have thermal management and radiation shielding to withstand the extreme environment of the lunar surface.

Once the lower part of the rover is developed, the consultative body expects it to function as a mobility platform, supporting an upper part carrying advanced technologies for digging, excavation, and human exploration of the lunar surface. The goal is to deliver a universally applicable mobility platform that can handle various payloads.

Following development, testing, and refinement, the plan is to land the rover near the south pole area of the moon to carry out scientific missions. The solar-powered, autonomous driving mobility unit will weigh around 70 kg (154.3 lb).

Before the rover can be sent to the moon, Hyundai will conduct mission-based performance testing of the development model in an environment similar to the lunar surface and refine the design based on the test results.

Hyundai Motor Group

NEWS AND PRODUCTS

Fundraising for machine tool AI

PHOTOS COURTESY OF RESPECTIVE COMPANIES

Productive Machines, an artificial intelligence (AI) startup from The University of Sheffield Advanced Manufacturing Research Centre (AMRC), raised £2.2 million ($2.75 million) in seed funding to make its advanced machine tool process optimization technology available to a wider range of manufacturers.

Productive Machines will use the funding to deliver its AI technology as a fully automated Software-as-a-Service (SaaS) product and to expand its team to more than 20 people.

Founded by Dr. Erdem Ozturk (CEO) and Dr. Huseyin Celikag (CTO), Productive Machines is commercializing the results of a six-year AMRC research project on machining dynamics. This research covered process and machine tool interactions, including how cutting forces and resulting vibrations affect machine tool performance.

Productive Machines’ powerful computational model predicts and mitigates the influence of harmful vibrations at every stage in metal and composite milling jobs. It uses a digital twin to determine the best parameters for each machine tool and production run, eliminating wasteful configuration experiments and ensuring milling jobs are right the first time.
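The physics behind chatter is well documented: regenerative vibrations between tool and workpiece limit the stable depth of cut, and the stable spindle-speed pockets depend on the frequency response at the tool tip. As a rough, hedged illustration of that relationship (not Productive Machines’ digital twin), the Python sketch below computes a classic Tlusty-style one-degree-of-freedom stability limit; the modal parameters, cutting coefficient, and tooth-engagement factor are assumed values chosen only for illustration.

```python
# Minimal 1-DOF stability-lobe sketch (Tlusty-style approximation).
# All numbers are assumed for illustration; they are not values from
# Productive Machines' model.
import numpy as np

fn = 800.0      # dominant tool-mode natural frequency, Hz (assumed)
k = 2.0e7       # modal stiffness, N/m (assumed)
zeta = 0.03     # damping ratio (assumed)
Ks = 7.0e8      # specific cutting force coefficient, N/m^2 (assumed)
Nt = 4          # number of cutter teeth (assumed)
mu = 0.5        # average fraction of teeth engaged in the cut (assumed)

wn = 2.0 * np.pi * fn
w = np.linspace(wn * 1.001, wn * 1.6, 2000)      # candidate chatter frequencies
r = w / wn
G = 1.0 / (k * (1.0 - r**2 + 2.0j * zeta * r))   # tool-point FRF, m/N

# Regenerative chatter limit (valid where Re[G] < 0, i.e. above resonance)
b_lim = -1.0 / (2.0 * Ks * mu * Nt * G.real)     # limiting depth of cut, m

# Phase condition links each chatter frequency to a spindle speed per lobe
theta = np.angle(G)                              # FRF phase, in (-pi, -pi/2)
eps = 3.0 * np.pi + 2.0 * theta                  # phase lag per tooth period

worst = np.argmin(b_lim)                         # lowest point of the lobes
print(f"critical depth of cut ~ {b_lim[worst]*1e3:.2f} mm "
      f"(chatter near {w[worst]/(2*np.pi):.0f} Hz)")
for lobe in range(4):
    tau = (2.0 * np.pi * lobe + eps[worst]) / w[worst]  # tooth period, s
    rpm = 60.0 / (Nt * tau)
    print(f"lobe {lobe}: least stable spindle speed ~ {rpm:.0f} rpm")
```

In practice, a digital twin of this kind is fit to measured frequency-response data for each machine, tool holder, and cutter combination, which is what makes it possible to replace trial-and-error configuration experiments.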

The technology has already been deployed at 10 major manufacturers, including Renault and MASA Aerospace. Machines configured by Productive Machines can produce parts in half the original time, delivering significant surface quality improvements due to mitigation of chatter vibrations created by instability in machining processes. Users report that cutting tools last up to 30% longer on optimized machines.

Productive Machines

Automating cleaning of complex aeronautic parts

Visionic designs off-the-shelf optical guidance and control solutions for complex manufacturing processes, aiming for constant improvement of industrial performance. Fuzzy Logic’s no-code software allows non-experts to create, simulate, and control a robot cell in real time, aiming to democratize industrial robotics. Fuzzy Logic and Visionic are removing the technological and financial obstacles to the robotization of applications such as pressure cleaning and decontamination of engine parts in the aeronautics industry.

Machined engine parts must be rendered particle-free before assembly and are subjected to meticulous cleaning at very high pressure. These noisy, arduous processes are currently performed manually, exposing operators to high-pressure waterjets of up to 60 bar, while the pollution and noise increase the risk of musculoskeletal disorders (MSDs). Aeronautical manufacturers and subcontractors are having difficulty recruiting candidates for these jobs, which are also highly regulated and closely monitored by the unions, so manufacturers are seeking to automate them. However, they are confronted with the complexity of robotization due to the diversity and number of parts and the limited availability of roboticists.

Visionic designed a robotic cell including a chassis, a robot, high-pressure hydraulic circuits, a filtration system, and a closed-circuit particle recovery system. It’s controlled with Fuzzy Studio, allowing robotization of complex tasks at lower costs and without expertise in robotic programming.

Generating robotic trajectories normally requires lengthy, complex programming by a roboticist. Automation is even more complicated for applications such as high-pressure cleaning of aeronautical parts with variable geometries. With Fuzzy Studio, complex trajectories are automatically generated in a few clicks using 3D information from objects placed in a virtual cell. It’s possible to add an unlimited number of waypoints to the trajectories, saving users time and freeing them from dependence on robotics experts.
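As a hedged illustration of the general idea (generating nozzle waypoints directly from a part’s 3D geometry rather than teaching them by hand), the sketch below builds a surface-following cleaning path for a simple cylindrical part. It is a generic example; the function name and parameters are invented and do not represent Fuzzy Studio’s actual trajectory engine or API.

```python
# Generic sketch of surface-following waypoint generation for a cleaning pass
# over a cylindrical part; not Fuzzy Studio's trajectory engine or API.
import numpy as np

def cleaning_waypoints(radius, length, standoff, n_rings=20, n_pts=36):
    """Generate nozzle poses ring by ring around a cylinder on the z-axis.

    Each waypoint is (position, approach vector): the nozzle sits `standoff`
    metres off the surface and sprays along the inward surface normal.
    """
    waypoints = []
    for z in np.linspace(0.0, length, n_rings):
        for ang in np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False):
            normal = np.array([np.cos(ang), np.sin(ang), 0.0])  # outward normal
            surface_pt = radius * normal + np.array([0.0, 0.0, z])
            position = surface_pt + standoff * normal           # nozzle position
            approach = -normal                                  # spray direction
            waypoints.append((position, approach))
    return waypoints

# Example: 150 mm diameter casing, 300 mm long, 50 mm nozzle standoff
wps = cleaning_waypoints(radius=0.075, length=0.3, standoff=0.05)
print(len(wps), "waypoints; first pose:", wps[0])
```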

Fuzzy Logic

Visionic

Simplifying PCB design work for robotic solutions

Summit Designer delivers standard market-ready printed circuit board (PCB) designs for robotic applications, saving significant cost, work, and time.

It’s an open-source PCB design library featuring a broad offering of market-ready, application-specific PCBs designed, supported, and updated by experts. It’s an optimal way to develop compact robot joints, multi-axis automated guided vehicle/autonomous mobile robot (AGV/AMR) systems, and industrial end-effectors.

Every design is easy to use, open-source, and delivered as a complete, fully customizable, and documented Altium project.

Users choose and add their desired modules to create a fully functional servo drive design for a market-ready robot. Options are designed to satisfy the most common requirements, such as connectors, communication protocols, safety functions, and motor and encoder specifications. Users receive a fully scalable and modular download file, ready to edit at their convenience.
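As a purely hypothetical sketch of what such a module selection might look like in code, the snippet below models a servo-drive configuration as a set of chosen options; the module names and values are invented for illustration and do not reflect Celera Motion’s actual library or file formats.

```python
# Hypothetical sketch of a modular servo-drive configuration; module names and
# options are invented for illustration, not taken from Celera Motion's library.
from dataclasses import dataclass

@dataclass
class ServoDriveConfig:
    connector: str        # e.g. board-to-board vs. cabled
    comm_protocol: str    # e.g. "EtherCAT", "CANopen"
    safety_function: str  # e.g. "STO" (safe torque off)
    motor_type: str       # e.g. "frameless BLDC"
    encoder: str          # e.g. "absolute, BiSS-C"

def summarize(cfg: ServoDriveConfig) -> str:
    """Return a one-line summary of the selected modules."""
    return (f"{cfg.motor_type} drive, {cfg.comm_protocol} comms, "
            f"{cfg.encoder} encoder, safety: {cfg.safety_function}, "
            f"connector: {cfg.connector}")

# Example: a compact robot-joint drive built from the chosen modules
joint_drive = ServoDriveConfig(
    connector="board-to-board",
    comm_protocol="EtherCAT",
    safety_function="STO",
    motor_type="frameless BLDC",
    encoder="absolute, BiSS-C",
)
print(summarize(joint_drive))
```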

Celera Motion

AMRs with visual simultaneous localization and mapping technology

Visual Simultaneous Localization and Mapping (Visual SLAM) technology enables autonomous mobile robots (AMRs) to make intelligent navigation decisions based on their surroundings. Using artificial intelligence (AI)-enabled 3D vision to perform location and mapping functions, ABB’s Visual SLAM AMRs make production faster, more flexible, efficient, and resilient while taking on dull, dirty, and dangerous tasks so people can focus on more rewarding work.

Visual SLAM combines AI and 3D vision technologies for superior performance compared to other AMR guidance techniques. Offering advantages over other forms of navigation, such as magnetic tape, QR codes, and traditional 2D SLAM, which require additional infrastructure to function, Visual SLAM AMRs are being embraced by companies for an expanding range of production and distribution tasks.

Visual SLAM uses cameras mounted on the AMR to create a 3D map of objects in the surrounding area. The system differentiates between fixed navigation references, such as floors, ceilings, and walls, which are added to the map, and objects such as people or vehicles that move or change position. The cameras detect and track natural features in the environment, enabling the AMR to dynamically adapt to its surroundings and determine the safest, most efficient route to its destination. Unlike 2D SLAM, Visual SLAM doesn’t require additional references and offers accurate positioning to within 3 mm.
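The feature-tracking step at the heart of this approach is well established. The sketch below shows a minimal version of it using OpenCV: detect natural features in two camera frames, match them, and recover the relative camera motion while RANSAC rejects points on moving objects. It is a generic illustration of the technique, not ABB’s Visual SLAM implementation; the camera intrinsic matrix K is assumed to be known.

```python
# Minimal visual-odometry building block behind feature-based visual SLAM.
# Generic illustration only; not ABB's Visual SLAM pipeline.
import cv2
import numpy as np

def relative_pose(frame1, frame2, K):
    """Estimate rotation R and unit-scale translation t between two frames.

    frame1, frame2: grayscale images; K: 3x3 camera intrinsic matrix.
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(frame1, None)
    kp2, des2 = orb.detectAndCompute(frame2, None)

    # Match descriptors and keep the strongest correspondences
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # RANSAC treats features on moving objects (people, vehicles) as outliers,
    # so the pose is estimated from the static scene only
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t
```

A full SLAM system layers map building, loop closure, and scale recovery on top of this step; the 3 mm positioning figure quoted above is ABB’s, not a result of this sketch.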

By eliminating the need to change the environment, stop production, or add infrastructure, Visual SLAM technology helps reduce commissioning time by up to 20% compared to 2D SLAM. The technology can be used at scale with fleets updated remotely and is also secure, as it analyzes raw data only, with no visual images saved on either the AMR or a server.

ABB Robotics