Vision-guided Robotics: In Search of the Holy Grail

While vision-guided robot “bin-picking” of randomly located parts has long proved an elusive goal, there are signs that the technology may now finally be set to emerge.


For more than two decades, machine vision practitioners have been predicting the commercial emergence of robotic random bin-picking—the ability of vision-guided robot arms to locate and pick individual parts from a jumble of parts piled haphazardly in a bin or container.

Highly flexible, random bin-picking systems would produce major savings for manufacturers, the early proponents declared. Human workers would no longer be required to unload incoming parts bins shipped by suppliers. And on machining and production lines, randomized parts piled in bins could replace expensive fixturing, tooling and component feeders used for part orientation.

Unfortunately, the widespread exuberance for the technology in the early 1980s gave way to hard reality later on. The “bin-picking problem” proved more difficult than anticipated. Bin-picking systems developed in the laboratories, it turned out, didn’t translate well into real-world factory applications. “The industry found out that this wasn’t so easy. You had things like partial occlusion with overlapping parts, and lighting variations that really stymied the progress of bin-picking,” notes Adil Shafi, president of Shafi Inc., a Brighton, Mich.-based software solutions provider that specializes in vision-guided robotics. A further complication was that computers of the time tended to choke on the massive amounts of processing required to successfully recognize parts piled randomly in a bin, and to calculate their 3D position and orientation for picking.

Eventually, less taxing 2D vision robot-guidance tasks—such as picking singulated parts from a moving conveyor—became relatively common. But widespread random bin-picking applications never materialized, and to this day remain a challenge. “It’s been a desired, but very elusive and unreachable goal for a long time,” Shafi observes. “A lot of people have referred to bin-picking as the Holy Grail of robotic material handling.”

Now, however, there are growing signs that vision-guided robot bin-picking may be finally moving closer to reality. Robot vendors including Fanuc, Motoman and Staubli have recently demonstrated bin-picking systems at trade shows. At least one North American systems integrator is planning this year to offer “semi-random” and random bin-picking as “standard product” technology for certain types of parts. And a number of automotive industry end-users are experimenting with bin-picking technology, with some early applications already in production.

At a TRW Automotive plant in Woodstock, Ontario, Canada, for example, Manufacturing Engineer Todd Denstedt says that two bin-picking systems designed to unload brake rotor castings are currently working well, and are scheduled to go into production around mid-year. Meanwhile, a Toyota Motor Manufacturing plant in Buffalo, W.Va., is already using five robotic bin-picking systems on its engine part machining lines. Those systems rely on ABB robots equipped with 3D vision technology supplied by Braintech Inc., of North Vancouver, British Columbia, Canada (see sidebar, “Single-Camera Bin-Picking”). All five were installed during last year’s second half, says Bob Welch, assistant manager of engineering at the Toyota plant. Though Welch declines to provide numbers, he says the five systems met Toyota’s capital expenditure requirement for return-on-investment (ROI) in two years or less.

No one is saying that totally random bin-picking for all kinds of parts is yet practical. Parts such as springs or complicated components with the potential to become tangled are unlikely candidates for early bin-picking applications, for example. Instead, vendors are focusing initially on parts with simpler, easily recognizable geometries, including cylindrical or circular shapes. And in fact, both the TRW and Toyota systems are examples of what industry sources variously refer to as “semi-random,” “semi-structured” or “semi-constrained” bin picking, in which parts are not totally random, but are in some way loosely located in the bin.

Unstack ‘em

In the TRW application, for instance, when brake rotor castings arrive at the plant from suppliers, they are arranged in stacks within wooden bins. “The parts are stacked, and there are no dividers, so they do tend to move during shipment, but they’re fairly well situated,” Denstedt says. Measuring about three feet by four feet, and three feet deep, each bin holds 100 to 140 castings.

TRW previously relied on operators equipped with lift-assist devices to unload the castings, which can weigh up to 33½ pounds each. In some cases, the plant has also used non-vision-equipped pick-and-place systems. The blind pick-and-place systems run on a 20-second cycle time, but due to parts shifting in the bin, they occasionally miss a part, and must then use another 20-second cycle to pick a different part from the bin.

The blind systems work well for low-volume lines that require a casting to be picked and placed every 60 seconds, says Denstedt. But when the plant had a recent requirement for a new line that would run on a 15-second cycle time, it turned to a vision-guided bin-picking system provided by JMP Engineering Inc., a London, Ontario, Canada, systems integrator.

JMP delivered two systems to TRW last November. Each uses a Fanuc M-710iB/45 robot from Fanuc Robotics America Inc., Rochester Hills, Mich., and works with a camera and VisionPro software from machine vision vendor Cognex Corp., Natick, Mass., as well as Reliabot software from Shafi, which handles robot/vision integration and robot guidance functions. A Cognex camera mounted on the robot arm is used to find the 2D x-y coordinates of parts, and an infrared (IR) sensor, also on the arm, is used for the z axis, to find the height of the top part on any given stack in the bin, says Ken McLaughlin, JMP’s flexible manufacturing group manager.

When an operator places a bin loaded with stacked castings into the system and hits the go button, the robot arm first rises to a high position and the camera takes two images covering the field of view to determine the x-y locations of all of the stacks, says McLaughlin. It then uses the IR sensor to determine the height of each stack, and decides on a sequence for picking. “If there’s one stack that’s abnormally high, it will continue picking off that stack until it’s level with the rest of the stacks,” notes Denstedt.
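For readers who want to see the sequencing idea spelled out, here is a minimal sketch with hypothetical stack heights and part thickness. It is not JMP’s actual code, but it captures the rule Denstedt describes: always take from the current tallest stack, which works an abnormally high stack down first and then cycles naturally through the rest.

```python
# Illustrative only (not JMP's code): pick ordering from per-stack heights
# measured by the IR sensor. Always taking from the current tallest stack
# levels an abnormally high stack first, then cycles through the rest.

def pick_sequence(stack_heights_mm, part_thickness_mm):
    """Yield stack indices in pick order until every stack is empty."""
    heights = list(stack_heights_mm)
    while any(h > 0 for h in heights):
        i = max(range(len(heights)), key=lambda k: heights[k])
        yield i
        heights[i] = max(0.0, heights[i] - part_thickness_mm)

# Four stacks, one abnormally tall; castings assumed roughly 40 mm thick:
order = list(pick_sequence([320.0, 440.0, 320.0, 320.0], 40.0))
print(order[:6])   # -> [1, 1, 1, 0, 1, 2]  (stack 1 is worked down first)
```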

One more look

Before picking each casting, the system takes an additional image of the part, to ensure that its position hasn’t shifted since the last pick. Thanks to its vision-guidance system, the robot can pick parts even when a stack is leaning by up to 20 degrees, says McLaughlin. A specially designed magnetic end-effector incorporates a compliance mechanism to ensure that parts oriented at an angle can be picked. Picked parts are placed by the robot on a conveyor for transport to a machining system. The entire process is accomplished in a 12-second cycle time, says McLaughlin, sufficient to meet TRW’s 15-second requirement.

“Everything has come through, and they are meeting the cycle time that we need,” confirms Denstedt. He says the bin-picking systems are scheduled to go into production around mid-year 2006 as part of a line timed to a vehicle model-year start-up. Expected benefits include improved ergonomics compared to manual lift-assisted unloading, better throughput and more flexibility, Denstedt notes.

In developing the TRW system, JMP decided to “keep it simple” with what McLaughlin calls “2½D vision,” which finds only the x, y and z coordinates of a part, while using end-effector compliance to compensate for variations in part roll, pitch and yaw. But if an application requires it, McLaughlin says he is confident that JMP could develop a full 3D system—using two cameras working in stereo mode, for example—that could reliably locate parts in six degrees of freedom.
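For the curious, the two-camera approach McLaughlin mentions rests on classic stereo triangulation: the farther apart a feature appears in the two images, the closer it is to the cameras. The sketch below is illustrative only, with hypothetical focal length, baseline and pixel coordinates; it is not JMP’s implementation.

```python
# Depth-from-disparity relationship for a two-camera stereo pair.
# The focal length, baseline, and pixel coordinates here are hypothetical.

def stereo_depth_mm(focal_px, baseline_mm, x_left_px, x_right_px):
    """Depth Z = f * B / d, where d is the horizontal disparity in pixels."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("the feature must sit further left in the left image")
    return focal_px * baseline_mm / disparity

# A feature at x=642 px in the left image and x=610 px in the right one,
# with an 800 px focal length and a 120 mm baseline between the cameras:
print(round(stereo_depth_mm(800, 120.0, 642, 610)))   # -> 3000 (mm)
```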

In fact, JMP is planning later this year to offer bin-picking as a “standard product,” says McLaughlin. “You’ll be able to buy it with a number of options, and you can just configure the vision guidance portion of it for the part you want picked.”

Complex-shaped components such as manifolds or transmission parts will likely require a semi-random solution, perhaps with dunnage within a bin to maintain part orientation, says McLaughlin. But relatively simple parts such as brake rotors that are circular or cylindrical in shape could be picked either in semi-random or completely random mode, he believes.

One of ten

To be sure, despite growing optimism in some camps regarding prospects for totally random bin-picking, the majority of bin-picking applications that are in use today are still semi-random or semi-structured.

JMP used vision technology from Cognex and Shafi for the Fanuc robot-based application deployed at TRW. But Fanuc also offers bin-picking capability using its own 3D vision technology. And at Fanuc Robotics America, Material Handling General Manager Dick Johnson says that out of an estimated 10 to 12 Fanuc bin-picking systems in production use today, only one is totally random. That system is used to pick connecting rods at a Japanese automaker. Another random system has been ordered by the same car maker, Johnson adds.

Fanuc’s bin-picking technology relies on a fixed-mounted camera above the bin area that does a “rough find,” identifying and ranking 15 candidate parts for potential picking, says Johnson. The software finds parts by using a pattern matching algorithm that compares what it sees to a stored database of sample images of the part to be picked, taken from multiple directions and orientations. The overhead camera is a standard, 2D gray-scale camera.

Mounted on the robot arm is a Fanuc sensor head that includes a second gray-scale camera and a laser striper that is used for what Johnson calls the “fine find” of individual parts. The gray-scale camera finds the x, y and roll of the part, while the laser striper, mounted at an angle to the camera, provides the z, yaw and pitch based on structured light and triangulation techniques. “The robot will go to the first part identified by the rough find system, and takes a look,” says Johnson. “If it likes what it sees, it will pick it. If it doesn’t like it, the arm moves on to the second part.”
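The “fine find” Johnson describes rests on ordinary triangulation geometry: because the laser is mounted at a known angle to the camera, a shift in where the stripe appears in the image translates directly into a change in part height. The sketch below illustrates the idea with hypothetical numbers; it is not Fanuc’s calibration or code.

```python
import math

# Illustrative triangulation geometry for a laser striper mounted at a known
# angle to the camera. The angle and the stripe displacement (already
# converted from pixels to millimeters) are hypothetical values.

def laser_height_change_mm(stripe_shift_mm, laser_angle_deg):
    """Height change implied by a lateral shift of the laser stripe.

    With the laser inclined at `laser_angle_deg` to the camera axis, a surface
    that rises or falls moves the stripe sideways in the image by
    shift = height_change * tan(angle); invert that to recover the height.
    """
    return stripe_shift_mm / math.tan(math.radians(laser_angle_deg))

# A 6 mm apparent shift of the stripe with the laser at 30 degrees implies
# roughly a 10.4 mm change in part height:
print(round(laser_height_change_mm(6.0, 30.0), 1))   # -> 10.4
```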

A variety of conditions might cause the system to skip over a part for picking, says Johnson. “We have to worry about hitting the sidewall of the bin, and we have to worry about having enough gripping force, if a part is not on top of the pile but has other parts on top of it.” The more parts skipped over, the longer the system cycle time.

Through continuous fine-tuning of all components of the system, Fanuc has reduced the overall average cycle time required to pick and place random parts from about 22 seconds in 2002 to about 15 seconds today, Johnson says.

End-user interest in random bin-picking is on the rise, Johnson relates. “I am evaluating probably three to five different companies’ parts for random bin-picking right now,” he says, adding that some or all of those applications could become reality during 2006.

Yes, but...

Though progress is being made, some vendors are also quick to raise a cautionary flag. At Motoman Inc., a West Carrollton, Ohio-based robot vendor, Senior Marketing Director Carl Traynor agrees that “a significant point of inflection” may be near for random bin-picking technology. Motoman has formed a strategic relationship with Shafi for applications such as bin-picking, and is working with customers on several applications, he says. But Traynor also warns that the industry must be careful not to oversell the technology.

“People have been waiting so long for this, and we really have had some success,” he notes. But bin picking “is not a panacea and end-all for every application,” Traynor adds. “People need to recognize that there are certain applications where it really makes sense, and others where it doesn’t.”

Shafi concurs. Given earlier vision technology failures in the 1980s and 1990s, he says, bin-picking proponents must proceed with caution. For example, Shafi points out that in some bin-picking applications, a reliable pick-and-place operation may require two stages. “You may have to get the part out of the bin, and then either drop it and re-grab it, or do some other kind of secondary operation,” he says, to prepare the part for precise placement into a target position by the robot arm. “That may not be the most elegant way to do it, but it gets the job done in a reliable way.”

“In a plant, all people care about is can you pick it up, and can you get it on that station every X seconds,” Shafi continues. “So we’d rather be a little less elegant, but completely reliable. We are very careful to make sure that all of these systems succeed, and that whatever we give a customer is something that will run, run, run, and instill confidence.”

Shafi Inc. installed an early bin-picking system at a customer site in 2003, says the president, and has since proven and demonstrated around two dozen different part geometries for random and semi-structured bin-picking, about six of which are being used at customer sites. The company maintains a strategic relationship with Cognex for vision technology, and Shafi software supports 15 robot families and 22 separate controllers.

While total cycle time for random robotic bin-picking depends on the robot process, and on how far the arm must move during a given application, vision recognition time is key, Shafi says. “In order to be viable in the real world for bin-picking, you’ve got to have vision recognition completed in two, or maybe three seconds at the most.”
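A rough budget shows why that two-to-three-second target matters. The robot-motion figures below are hypothetical, chosen only to illustrate how quickly a 15-second cycle gets consumed.

```python
# Back-of-the-envelope budget: the robot-motion times below are hypothetical,
# but they show how little room a 15-second cycle leaves for vision.

VISION_S = 2.5          # Shafi's target: recognition in two to three seconds
MOVE_AND_PICK_S = 5.0   # hypothetical approach and pick motion
MOVE_AND_PLACE_S = 6.0  # hypothetical transfer and place motion
CYCLE_TARGET_S = 15.0

used = VISION_S + MOVE_AND_PICK_S + MOVE_AND_PLACE_S
print(f"{used:.1f} s used of {CYCLE_TARGET_S:.0f} s, {CYCLE_TARGET_S - used:.1f} s margin")
# -> 13.5 s used of 15 s, 1.5 s margin
```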

Shafi says his company is hitting that goal today for parts with simple geometries, down from four to six seconds 18 months ago. The gains are due largely to faster processors and improvements in vision algorithms, he says. And the emergence of 64-bit PCs this year and in 2007 will be a “huge facilitator in taking bin-picking technology from emerging to mature.”

Trick bag

To some extent, successful bin-picking comes down “to how many tricks you have in the bag,” Shafi says. Depending on part geometries and application requirements, Shafi has developed bin-picking systems that rely on fixed-mount overhead cameras, arm-mounted cameras or hybrid combinations of both, as well as those that use laser-based structured lighting or laser dots.

Other techniques include switched lighting, in which lighting sources are fired as needed based on where a part to be picked is located within a bin—say, in a corner or near a sidewall. Yet another technique involves taking two or three images of a part very rapidly, each with different lighting, as an aid to finding its 3D position. This can be particularly helpful when parts are partially occluded, Shafi says.
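One simple way to picture the multi-image technique is to fuse the frames, so that an edge left in shadow under one light survives from another. The sketch below is a hypothetical illustration of that idea, not Shafi Inc.’s algorithm.

```python
import numpy as np

# Hypothetical illustration (not Shafi Inc.'s algorithm): fuse several frames,
# each captured under a different light, so an edge shadowed in one frame can
# survive in the composite used for part finding.

def fuse_exposures(frames):
    """Pixel-wise maximum across equally sized 2-D grayscale frames."""
    return np.stack(frames, axis=0).max(axis=0)

# Three frames captured back-to-back with, say, left, right and overhead
# lighting fired in turn (random data stands in for real images here):
frames = [np.random.randint(0, 256, (480, 640), dtype=np.uint8) for _ in range(3)]
composite = fuse_exposures(frames)
print(composite.shape)   # -> (480, 640)
```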

If an application allows it, Shafi observes, a fixed-mount overhead camera solution is usually most desirable. For one thing, after a part has been picked, the vision system can be working on the next part to be picked while the robot arm is placing the current one. For another, a moving arm-mounted camera risks collisions with bin walls or parts, which is not a concern with a fixed-mount camera, Shafi points out.
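The scheduling advantage of the fixed-mount camera can be made concrete with a small sketch. The timing values and stub functions below are hypothetical; the point is simply that vision recognition for the next part overlaps the robot’s place motion, so most of its cost disappears from the cycle.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical timings and stub functions, sketching how a fixed-mount camera
# lets vision work on the next part while the robot places the current one.

def locate_part(image_id):
    time.sleep(2.0)                       # stand-in for ~2 s of vision recognition
    return f"pose-for-image-{image_id}"

def pick_and_place(pose):
    time.sleep(9.0)                       # stand-in for the robot's pick-and-place motion
    print(f"placed part found at {pose}")

def run_cell(n_parts=3):
    start = time.time()
    with ThreadPoolExecutor(max_workers=1) as vision:
        pending = vision.submit(locate_part, 0)
        for i in range(1, n_parts):
            pose = pending.result()                  # wait for the current find
            pending = vision.submit(locate_part, i)  # start the next find immediately
            pick_and_place(pose)                     # robot motion overlaps that find
        pick_and_place(pending.result())
    print(f"total: {time.time() - start:.1f} s")     # only the first find adds time

run_cell()
```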

One company that has done extensive experimentation with bin-picking is American Axle & Manufacturing Inc., a Detroit-based maker of driveline and chassis systems. Over the past 18 months, the company has worked with Shafi on several potential bin-picking applications—both random and semi-random—involving parts such as forged axle shafts, cast differential carrier housings and forged pinions, says Dan Bickersteth, American Axle corporate manager for cycle time improvement and automation.

Already viable

So far, the company has chosen not to purchase and install any of the bin-picking projects developed, but not because they didn’t work, Bickersteth says. “I’m becoming more and more confident that bin-picking is going to be viable, and in certain applications, it’s already viable,” he declares.

Some bin-picking applications could produce a simple return-on-investment for American Axle within about two years, Bickersteth estimates. But for now, he indicates, that’s not quite good enough—given other automation opportunities that have a 12- to 18-month payback potential.

“Right now, we’re pursuing a set of projects that don’t involve vision. They’re much more straightforward automation, so we’re kind of focused on those,” Bickersteth explains. “But I think that these bin-picking applications will come to be,” he adds. “It’s just that they’re probably a year, or maybe a year-and-a-half down my list.”


See sidebar to this article: Single-Camera Bin-Picking
