“Joy Cars are joystick-controlled cars for disabled people that have been retrofitted with actuators and a joystick controller system,” says Daishi Watabe, director of the Center for Self-Driving Technologies at SIT, who heads the Yanba Smart Mobility Project. “They were developed through an industry-SIT collaboration.”
Tatsuma Okubo, a general manager at ITbook Holdings and the project’s manager, describes the autonomous technology setup as follows: a PC running the Autoware software takes in data from various sensors, including Lidar, cameras, and a Global Navigation Satellite System (GNSS) receiver. The software communicates over a controller area network (CAN) bus with a vehicle motion controller, which in turn drives two joystick-controlled Joy System sub-control units: one for steering and one for acceleration and braking.
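The control chain Okubo describes can be sketched as follows. This is an illustrative toy model only: the frame layout, scaling factors, and unit names are assumptions for the sake of the example, not the project’s actual CAN protocol.

```python
import struct
from dataclasses import dataclass

@dataclass
class ControlCommand:
    """Planner output of the kind an Autoware-style stack produces (illustrative)."""
    steering_angle_deg: float   # desired steering angle
    accel_pct: float            # positive = accelerate, negative = brake

def encode_frame(cmd: ControlCommand) -> bytes:
    """Pack the command into a toy 4-byte CAN payload (two signed 16-bit
    values, scaled by 10 to keep one decimal of precision)."""
    return struct.pack(">hh",
                       int(cmd.steering_angle_deg * 10),
                       int(cmd.accel_pct * 10))

def dispatch(frame: bytes) -> dict:
    """Vehicle motion controller: decode the frame and route each signal to
    its sub-control unit, mirroring the steering and accel/brake split
    described in the article."""
    steer_raw, accel_raw = struct.unpack(">hh", frame)
    return {
        "steering_unit": steer_raw / 10.0,     # degrees
        "accel_brake_unit": accel_raw / 10.0,  # percent
    }

frame = encode_frame(ControlCommand(steering_angle_deg=-12.5, accel_pct=30.0))
outputs = dispatch(frame)  # {'steering_unit': -12.5, 'accel_brake_unit': 30.0}
```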
“Basically, our autonomous bus system substitutes the voltage data from the joystick interface with voltage data from the Autoware electronic unit,” says Watabe. “We are developing two sets of remodeled Joy Car actuators for retrofitting in the Naganohara amphibious bus: one set for use on land, and the other, a remodeled version of the land actuators, for use on water.”
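The voltage-substitution idea can be illustrated with a minimal sketch: the joystick normally outputs an analog voltage, and the autonomy stack replaces it with an equivalent voltage computed from its own command. The voltage range and neutral point below are assumed values for illustration, not the Joy System’s published specifications.

```python
# Assumed joystick output range (volts): low, neutral (hands-off), high.
V_MIN, V_NEUTRAL, V_MAX = 0.5, 2.5, 4.5

def command_to_voltage(cmd: float) -> float:
    """Map a normalized command in [-1.0, 1.0] (full brake/reverse to full
    throttle) onto the joystick interface's voltage range."""
    cmd = max(-1.0, min(1.0, cmd))  # clamp out-of-range commands for safety
    if cmd >= 0:
        return V_NEUTRAL + cmd * (V_MAX - V_NEUTRAL)
    return V_NEUTRAL + cmd * (V_NEUTRAL - V_MIN)

# command_to_voltage(0.0)  -> 2.5 (neutral)
# command_to_voltage(1.0)  -> 4.5 (full forward)
# command_to_voltage(-1.0) -> 0.5 (full reverse/brake)
```

In a real retrofit this mapping would feed a digital-to-analog converter wired in place of the joystick’s analog output, which is why the substitution can be done without modifying the Joy System sub-control units themselves.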
He says the autonomous control system will manage four major areas: recognition of where the vehicle enters and exits the water; sensor stabilization to counter the ship’s rolling; self-localization techniques that cope with changes in the surrounding 3D views, given that the water level in the dam can change dramatically; and a sonar-based obstacle-avoidance scheme. AI is also used to assist in obstacle detection, self-localization, and path planning.
When the dam was created, buildings and trees were left in place.
“Given the height of the lake can change by as much as 30 meters, we have to recognize underwater obstacles and driftwood to avoid any collisions,” says Watabe. “But because visibility underwater is poor, cameras are not suitable. And Lidar doesn’t function well underwater. So we need to use sonar sensors for obstacle detection and path planning.”
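A minimal sketch of sonar-based obstacle checking might look like the following, assuming a forward-facing sonar array that returns one range reading (in meters) per bearing. The safety threshold and sensor model are illustrative assumptions; the project’s actual scheme has not been published.

```python
def clear_headings(ranges_by_bearing: dict, safe_range_m: float = 15.0) -> list:
    """Return bearings (degrees) whose sonar return is beyond the safety
    range, i.e. candidate headings for the path planner to consider."""
    return sorted(b for b, r in ranges_by_bearing.items() if r >= safe_range_m)

# A scan with submerged debris (e.g. driftwood) near dead ahead:
scan = {-30: 8.0, -15: 22.0, 0: 5.5, 15: 40.0, 30: 18.0}
open_bearings = clear_headings(scan)  # [-15, 15, 30]
```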
What’s more, 3D views change according to the water level, and Lidar has no surrounding objects to reflect off when the bus is in the middle of the lake. “This means a simple scan-matching algorithm is not sufficient for self-localization,” explains Watabe. “So we’ll also use global navigation satellite data enhanced through real-time kinematic positioning and a gyro-based localization scheme.”
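One common way to combine the two sources Watabe mentions is a complementary filter: integrate the gyro for smooth short-term heading, and correct its slow drift with heading derived from RTK-enhanced GNSS fixes. The sketch below is a hedged illustration of that general technique, not the project’s implementation; the gain is an assumed value.

```python
def fuse_heading(gyro_heading_deg: float, gnss_heading_deg: float,
                 alpha: float = 0.98) -> float:
    """Complementary filter step: trust the (smooth but drifting) gyro
    heading mostly, and nudge it toward the (noisy but drift-free)
    GNSS-derived heading."""
    # Handle wrap-around so 359 deg and 5 deg blend near 0 deg, not 180 deg.
    diff = (gnss_heading_deg - gyro_heading_deg + 180.0) % 360.0 - 180.0
    return (gyro_heading_deg + (1.0 - alpha) * diff) % 360.0

# Example: gyro has drifted to 359 deg while GNSS reports 5 deg;
# the fused estimate moves slightly toward the GNSS value.
fused = fuse_heading(359.0, 5.0)
```

The same blend-and-correct structure applies to position, which is where the centimeter-level fixes from real-time kinematic positioning pay off.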
The biggest difficulty the project faces, according to Okubo, is the short construction period available, as they only have the off-season—December to March—this year and next to install and field test autonomous functionality.
Another challenge: because winds and water flows can affect vehicle guidance, careful handling is required when entering and exiting the water so that the underwater guardrails do not cause damage. Consequently, the group is developing a precise control system to govern the rudder and propulsion system.
“We’ll install and fine-tune the autonomous functionality during two off-season periods in 2020-21 and 2021-22,” says Watabe. “Then the plan is to conduct field tests with the public in February and March 2022.”
Besides tourism, Okubo says the technology has huge potential to “revolutionize logistics” to Japan’s remote islands, which face a survival crisis due to declining populations. As an example, he says only a single driver (or no driver, once full automation is introduced) would be necessary during goods transshipments to such islands. This should reduce costs and enable more frequent operations.