Tasks that are easy for humans can be difficult for robots. Pallet handling is a perfect example.
Human operators are great at picking pallets of all shapes and sizes, even if there is plastic wrap covering the pockets, or if the pallets are skewed in the rack. Automated lift trucks work best when pallets are clean, in good condition, and oriented neatly.
Conversely, repetitive tasks requiring precision are easy for robots, but human forklift operators can find it challenging to deliver the same level of precision. Automated lift trucks can place pallets with consistent, minuscule gaps between units, while human operators naturally vary their placement.
The big question manufacturing and warehousing operators need to ask is: which workflows can be automated with today’s technology?
Setting up tech for success
Tom Panzarella, chief technology officer at Seegrid, said robotics customers and integrators need to mutually set each other up for success.
When Seegrid proposes solutions to the problems its customers bring, it defines the technology’s operating constraints. Those constraints usually aren’t burdensome for customers who are willing to meet the technology at its current, stable level of functionality for their deployments.
“It doesn't make a lot of sense to try to push the edges of the tech [when deploying commercially],” Panzarella said. “What you really want to do is set the tech up for success.”
David Griffin, chief sales officer at Seegrid, said because automated lift trucks are a relatively new technology, new customers haven’t developed an appropriate set of expectations just yet. But, as the materials handling market gains more experience with automation, expectations will begin to fall in line with reality.
“Many of our customers that have been at this for a long enough time, they have a really good understanding of what is and isn't possible,” Griffin said. Those customers work with Seegrid to choose the right applications, environments, and facilities to implement automation, knowing that not all processes can be automated.
Many trucks doing many things at the same time
Griffin said lots of customers start by trying to automate the hardest tasks in their workflows, falsely assuming that automated lift trucks can do everything a person can. Although robots can’t handle overly complex tasks, they can still add value for industrial customers.
“Why would you choose not to automate 90% of your processes because there's 10% that's just too difficult?” Griffin asked.
Repetitive tasks are the best candidates for automation, especially because they’re boring and mundane for people. Griffin said automated lift trucks handle repetitive tasks very well, allowing warehouse operators to call upon their employees for tasks that require human capabilities and thinking, improving labor productivity.
While sensing, picking, and placing pallets is a necessary capability, Griffin said warehouse operators also need fleet management software that coordinates vehicle activities if they want to deliver precise pallet conveyance.
“Can you get the right truck to the right place to handle the activity?” Griffin asked. “Can you optimize the job throughput?”
Strong fleet orchestration software is necessary to deliver repeatable and precise automated pallet movements, especially at commercial scale. At a local level, each lift truck needs to safely handle pallets and navigate. But from a throughput perspective, facility operators don’t choose automation because they want a single robot; they want to add an entire fleet of autonomous vehicles to improve speed and accuracy.
“It's comparatively fairly easy to get one truck to do one thing, one time,” Griffin said. “To get many trucks doing many things at the same time is quite the challenge. That is where the [fleet management] software comes into play.”
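As a rough illustration of the kind of decision fleet orchestration software has to make, the sketch below greedily assigns each pending pallet-move job to the nearest idle truck. The class names and dispatch rule are hypothetical simplifications, not Seegrid’s fleet manager, which also has to weigh traffic, battery state, and job priorities.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Truck:
    truck_id: str
    x: float          # current position in facility coordinates (meters)
    y: float
    busy: bool = False

@dataclass
class Job:
    job_id: str
    pickup_x: float   # pallet pickup location
    pickup_y: float

def dispatch(jobs: list[Job], trucks: list[Truck]) -> dict[str, str]:
    """Greedily assign each pending job to the nearest idle truck.

    Returns a mapping of job_id -> truck_id. This only shows the core
    idea of matching work to vehicles to keep job throughput up.
    """
    assignments: dict[str, str] = {}
    for job in jobs:
        idle = [t for t in trucks if not t.busy]
        if not idle:
            break  # no capacity left; remaining jobs wait for the next cycle
        nearest = min(idle, key=lambda t: hypot(t.x - job.pickup_x, t.y - job.pickup_y))
        nearest.busy = True
        assignments[job.job_id] = nearest.truck_id
    return assignments

if __name__ == "__main__":
    trucks = [Truck("T1", 0.0, 0.0), Truck("T2", 50.0, 10.0)]
    jobs = [Job("J1", 48.0, 12.0), Job("J2", 5.0, 2.0)]
    print(dispatch(jobs, trucks))  # {'J1': 'T2', 'J2': 'T1'}
```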
Computer vision delivers a new level of operational discipline
The strength of automation - more precise pallet placement - can actually be a hindrance to implementation. Griffin said automation requires a level of operational discipline that industrial facilities and their operators might not be equipped for.
In manual forklift operations, pallets can be placed “close enough” to a target location. But because automated lift trucks work best when loads are more consistently placed, they may struggle to pick pallets placed by manual operators.
“If you want to automate it, you have to be more disciplined with what you're putting and where you're putting it,” Griffin said. “The level of rigor that's required for automation is hard for customers to get used to.”
AI and computer vision systems enable automated lift trucks to deliver precise pallet placement and adapt to imperfect loads and the real-world dynamics present in industrial environments. From a technical capability standpoint, Panzarella said human vision is far more sophisticated than today’s computer vision systems, especially with regard to object detection and localization.
Humans only need to see a handful of examples to learn what a pallet looks like. We can easily differentiate palletized loads from obstacles. And humans are very good at identifying fork pockets in pallets, or in any other fork-pickable carrier that isn’t a pallet at all.
“You have to teach a computer, ‘This is what a pallet looks like,’ to include all of its permutations,” Panzarella said. “Getting a computer to [reliably detect pallets in real-world production environments] is non trivial.”
Automated lift trucks rely on sensors to collect data about their surroundings, and that data doesn’t immediately make it clear where pallets are located. Compared to human vision and intuitive perception, which can seamlessly locate objects, machine vision requires more discrete steps.
“What the computer gets is just a digital sampling of the space,” Panzarella said. “It is the job of the software algorithms processing the sensor data to precisely localize the pallet in real-time.”
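To make that concrete, here is a minimal, hypothetical sketch of how software might pull pallet geometry out of such a sampling: a single horizontal depth profile taken across the pallet face at fork height, with the fork pockets showing up as runs of readings noticeably deeper than the face. It is a toy example under those assumptions, not Seegrid’s localization algorithm.

```python
import numpy as np

def find_fork_pockets(depths: np.ndarray, angles: np.ndarray,
                      gap_threshold: float = 0.10) -> list[tuple[float, float]]:
    """Locate fork pockets in a single horizontal depth profile.

    depths: range readings (meters) across the pallet face at fork height.
    angles: the bearing (radians) of each reading.
    A pocket appears as a contiguous run of readings deeper than the
    pallet face. Returns a (start_angle, end_angle) pair per pocket.
    """
    face = np.median(depths)                 # rough depth of the pallet face
    in_pocket = depths > face + gap_threshold
    pockets = []
    start = None
    for i, flag in enumerate(in_pocket):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            pockets.append((float(angles[start]), float(angles[i - 1])))
            start = None
    if start is not None:
        pockets.append((float(angles[start]), float(angles[-1])))
    return pockets

if __name__ == "__main__":
    # Synthetic scan: pallet face 1.5 m away, two pockets 0.4 m deeper.
    angles = np.linspace(-0.4, 0.4, 200)
    depths = np.full(200, 1.50)
    depths[40:70] = 1.90                 # left pocket
    depths[130:160] = 1.90               # right pocket
    print(find_fork_pockets(depths, angles))   # two (start, end) angle pairs
```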
Cameras and sensors provide context
Seegrid uses a variety of cameras, lidar, and other sensors to gather information about the environment, enabling the autonomous mobile robots (AMRs) to accurately pick, transport, and place pallets.
To facilitate feedback to its pallet manipulation stack, Panzarella said Seegrid uses time-of-flight cameras and lidar to estimate the pose of a pallet when picking or the facility infrastructure (e.g., tables and racks) when placing.
Pallets in the real world are often damaged or presented to the AMRs with high levels of variability. Computer vision systems need to be resilient enough to tolerate these disturbances to reliably determine the position and orientation of the pallet, as well as estimate the location of the fork pockets. One particular challenge Panzarella noted is occlusions, often from facility infrastructure, that block the camera’s line of sight to the pallet.
To get the best view of pallets, Seegrid places sensors between the forks. Panzarella said, in general, aligning the camera with the center of the pallet provides the most favorable, unoccluded view of the fork pockets.
“Depending upon how your algorithms work, the pose of the camera with respect to the pallet could significantly affect your system’s ability to properly estimate the fork pockets,” he said.
Additionally, Panzarella said that image resolution, camera optics, and field of view all affect the distance at which a pallet can be seen by an AMR.
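A back-of-the-envelope pinhole-camera calculation shows why. The sketch below estimates how many pixels a fork pocket occupies at various distances for an assumed camera; the pocket width, resolution, and field of view are made-up numbers, not Seegrid specifications.

```python
import math

def pixels_on_target(target_width_m: float, distance_m: float,
                     image_width_px: int, horizontal_fov_deg: float) -> float:
    """Approximate how many pixels a feature of a given width occupies.

    Uses a simple pinhole-camera model: the feature's angular size divided
    by the camera's per-pixel angular resolution.
    """
    angular_size = 2.0 * math.atan(target_width_m / (2.0 * distance_m))
    pixels_per_radian = image_width_px / math.radians(horizontal_fov_deg)
    return angular_size * pixels_per_radian

# A nominal 0.3 m fork pocket seen by a 640-pixel-wide, 70-degree camera:
for d in (1.0, 2.0, 4.0):
    print(f"{d:.0f} m -> {pixels_on_target(0.30, d, 640, 70.0):.0f} px")
# 1 m -> ~156 px, 2 m -> ~78 px, 4 m -> ~39 px
```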
Sensors are not all created equal, either. Depending upon the type of sensor, the underlying physics, and vendor-specific quirks, the data will be different.
As a result, a software algorithm developed for one sensor may not work with a different piece of hardware. “There are so many variables that do not make pallet detection a one scoop of vanilla ice cream-type problem,” Panzarella said. “You have got to take in all the context and constraints of the problem you are trying to solve.”
AI and machine learning software help solve industrial problems
Panzarella said Seegrid takes a “belt and suspenders” approach to functional safety, building its vehicle hardware and software over and above regulatory requirements and specifications. Sometimes this approach can present challenges.
For example, if a piece of debris (e.g., a slip sheet) sits between the AMR and the pallet or occludes the fork pockets, Seegrid’s automated lift trucks identify it as an obstacle and stop. This safety behavior can help prevent injuries, because the robot stops for any obstacle, including a person or a limb in the way.
“The edge of the technical problem right now is classifying things like those obstructions to say, ‘Well, that's actually an obstruction I can drive through and I should drive over,’ whereas something else, ‘That's not something I should drive through or drive over,’” Panzarella said.
“I think there's still a lot of room for technical development in that space,” he added. “The balance you've got to strike is uptime and availability of the vehicles actually doing the work while also doing it safely.”
Training AI software systems to support the variety of pickable objects, not just pallets, isn’t straightforward.
Panzarella said it’s unscalable for humans to describe the salient features that define all pickable objects to a computer because of the diversity of load types Seegrid deals with in the manufacturing industry. Instead, Seegrid is employing deep learning-based feature detectors to surface those features through examples and data.
However, machine learning on its own might not produce outputs that meet the safety requirements of automated lift trucks. So rather than treating its neural network as a black box and accepting the output as truth, Seegrid is applying a hybrid-AI approach: it validates the output of the neural network using classical computer vision techniques in real time.
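The sketch below illustrates the general pattern of such a hybrid check with hypothetical dimensions and thresholds: a candidate pallet face proposed by a neural network is accepted only if its 3D points actually form a roughly planar surface of plausible pallet width. It is an illustration of the idea, not Seegrid’s implementation.

```python
import numpy as np

# Nominal pallet face width (meters) and loose tolerances -- assumed values.
PALLET_WIDTH = 1.22
WIDTH_TOL = 0.15
MAX_PLANE_RMS = 0.03   # allowed deviation (meters) from a flat face

def validate_detection(face_points: np.ndarray) -> bool:
    """Sanity-check a neural network's pallet detection with classical geometry.

    face_points: (N, 3) array of 3D points the network attributed to the
    pallet face. Instead of trusting the network as a black box, check that
    the points form a roughly planar surface of plausible pallet width.
    """
    if face_points.shape[0] < 50:
        return False  # too few points to trust the detection

    # Fit a plane by PCA: the smallest principal direction is the face normal.
    centered = face_points - face_points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    rms_off_plane = np.sqrt(np.mean((centered @ normal) ** 2))
    if rms_off_plane > MAX_PLANE_RMS:
        return False  # surface is not flat enough to be a pallet face

    # Compare the extent along the largest in-plane direction to pallet width.
    spread = centered @ vt[0]
    measured_width = spread.max() - spread.min()
    return abs(measured_width - PALLET_WIDTH) < WIDTH_TOL
```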
“We're solving industrial problems with the data sets we have,” Panzarella said, noting that the AI and data sets Seegrid is using are different from the massive, generalized AI models described in popular media. “Industrial use-cases typically are not afforded access to internet-scale data. Relatively speaking, we are dealing with small data.”
Data sharing to demonstrate what’s possible
As ever-advancing technologies are deployed into industrial environments, facility operators will expect them to behave in predictable ways, while decision makers will want to keep pushing the technology toward the edge of its performance.
Even as the technology has modernized, Panzarella said, people need to get comfortable with new mindsets around data sharing. Safety and performance can only improve if data sharing policies and operational processes catch up.
“If you want ‘artificially intelligent autonomous machinery optimizing your workflows,’ you need to give it its nourishment - and its nourishment is data,” he said. “You need modern policies in place allowing for access to data from the field so that the models can learn and improve over time.”
Data is a useful tool not just for development, but also for demonstrating capabilities to customers. Griffin said Seegrid’s customers make significant investments in automation systems.
Customers want to know if their robots are performing sufficiently to accomplish their business cases. They want to see which robots are doing what jobs so they can evaluate throughput and efficiency gains.
“From a data perspective, anything that we as an industry can provide to the customers to let them know how automation is performing is super helpful,” Griffin said.
In terms of return on investment (ROI), Griffin said many customers can perform straightforward calculations to determine if automation is helping. In many cases, reduced staffing is a key performance indicator that demonstrates the benefits of deploying robots.
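A stripped-down version of such a calculation might look like the following; all figures are hypothetical, and a real ROI model would also capture throughput gains, damage reduction, and labor that is redeployed rather than eliminated.

```python
def payback_months(robot_cost: float, annual_labor_savings: float,
                   annual_service_cost: float) -> float:
    """Simple payback-period calculation for an automation purchase."""
    net_annual_savings = annual_labor_savings - annual_service_cost
    if net_annual_savings <= 0:
        return float("inf")  # automation never pays for itself at these numbers
    return 12.0 * robot_cost / net_annual_savings

# Hypothetical example: one automated lift truck covering manual labor hours.
print(f"{payback_months(150_000, 90_000, 12_000):.1f} months")  # ~23.1 months
```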
But in some situations, ROI calculations become complicated. Some customers don’t choose applications where added value can be clearly determined. Griffin said some of Seegrid’s customers are interested in testing new technologies, and less interested in proving a business case.
Regardless, adding robots to operations has demonstrated clear benefits in the long run.
“Seegrid over time has sold thousands of robots,” Griffin said. “And the reason we sell thousands of them is because we save our customers a lot of money.”
Want to learn more about machine vision? This article was featured in the August 2024 Robotics 24/7 Special Focus Issue titled “Machine vision to increase robot precision.”