Launch HN: Azalea Robotics (YC S24) – Baggage-handling robots for airports
Unedited autonomous ops: https://www.youtube.com/watch?v=DuJ3ZORnO1o
Teleoperated (sped up, so no sound): https://www.youtube.com/watch?v=LeK8NQLnYgA
The marketing version: https://www.youtube.com/watch?v=k0SDPm09U6s
Robotics is in an interesting place right now: warehouse automation companies have been humming along for almost a decade, while a lot of new effort is going into general-purpose hardware (humanoids) and general-purpose software (generalist robotics foundation models). We love these efforts (David used to work on one at Google X with Everyday Robots), but we also see a lot of utility in the current wave of robotics planning and perception tech that can enable new use cases today.
Airlines in the US compete primarily on efficiency and customer loyalty, and baggage handling hits both (John B. has first-hand experience from working on baggage optimization projects at United Airlines). 2% of flights are delayed by baggage errors, leading to downstream network delays. Baggage handling is also a major source of customer complaints: almost everyone has a horror story about a missing bag, and some people vow never to fly an airline again after it loses their belongings. Furthermore, it's a genuinely dangerous job for workers from a repetitive-stress standpoint. Upcoming EU regulation reflects this, protecting workers by capping the number of bags transferred per shift to reduce the back and tendon injuries that are inherent to the job.
Unfortunately for airlines, passengers don’t package their luggage in nicely uniform cardboard boxes. If they did, then the airlines could benefit directly from the recent takeoff in manipulator tech for warehouses. But airline luggage is way more wacky and irregular. If robots are going to handle it, they need to reason about how to grasp each item, handle its deformability, stack it in a stable way, and do all of this quickly, safely, and reliably.
This is what we’re tackling at Azalea. We’re bringing our expertise in deformable-object manipulation, perception, robot learning, and planning to this logistics problem.
We have a few strong bets behind what we’re working on: (1) The hardware to solve this problem has been available or manufacturable for decades; what’s been missing is perception, planning, and control. (2) Cobots (robots designed to operate alongside humans) aren’t enough for safety here. To do this task efficiently, you need to move bags of up to 50 kg very quickly, which can be dangerous no matter how well a cobot is designed. Light curtains (arrays of lasers that stop a machine when interrupted) and machine cages are the current industrial standard and remain the way to go. (3) Software for generalist robots needs more data than most people today believe, and it will be at least 15 years before it’s deployable: we should focus on specialized problems of economic value.
Our core technical developments are in a few areas:
– Grasp synthesis and selection: From visual data only, how can we identify good candidate grasp points and rank them? For this, we use a mix of physical reasoning, heuristics, and a lot of learning from previous data, combined into a single objective function (a rough sketch of what that can look like is after this list). Success also has to be judged on both the initial grasp and a continuous hold throughout the transfer.
– Placement planning: How do we lay out luggage in the module we’re loading? There’s a nice ramp-up in difficulty for this problem, from open-loop “divide the world into a grid” approaches, to 3D bin-packing optimization, to reinforcement learning (a toy version of the grid baseline is sketched below). An interesting constraint for us is that the bags have to stay physically stable once the cart starts driving, and lighter, deformable objects shouldn’t end up underneath heavy, hard ones. We use a similar mix of physics and learning to model this problem.
– Fast collision-free planning: Off-the-shelf planners work great for the most part but can fail in heavily cluttered or dynamic scenes. We leverage the fact that we’re always solving a series of similar problems to provide initial guesses for downstream trajectory optimization; because consecutive problems look so alike, we can use generative-model-style techniques to propose these initial plans (see the warm-start sketch below).
– Mechanical design: The perfect tool for picking up everything that comes down a check-in conveyor belt isn’t an easy thing to design. We’re building tools with multiple modes of grasping to handle a wide variety of objects. The videos we linked to use suction only – which can be surprisingly powerful! An interesting autonomy problem then becomes choosing which mode to use when, and how to use it.
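
To make the grasp-ranking idea concrete, here’s a minimal Python sketch of combining physics-flavored heuristics with a learned score in one objective. The feature names and weights are hypothetical placeholders for illustration, not our actual stack:

    # Hypothetical sketch: rank suction grasp candidates by mixing
    # simple heuristics with a learned score in a single objective.
    from dataclasses import dataclass

    @dataclass
    class GraspCandidate:
        flatness: float         # how planar the surface patch is, 0..1
        seal_area_cm2: float    # estimated suction seal area
        com_offset_cm: float    # distance from estimated center of mass
        learned_score: float    # model trained on past pick/hold outcomes, 0..1

    def objective(g, w_flat=1.0, w_seal=0.5, w_com=0.2, w_learned=2.0):
        # Higher is better; weights are illustrative, not tuned values.
        heuristic = (w_flat * g.flatness
                     + w_seal * g.seal_area_cm2 / 100.0
                     - w_com * g.com_offset_cm)
        return heuristic + w_learned * g.learned_score

    def rank(candidates):
        return sorted(candidates, key=objective, reverse=True)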
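
Here’s a toy version of the open-loop grid baseline for placement, enforcing only the constraint mentioned above (don’t put heavy, hard bags on light, deformable ones). Again, an illustrative sketch under our own assumptions, not the planner we run:

    # Hypothetical greedy grid placement: fill the lowest stack that accepts
    # the bag, never stacking a heavier rigid bag on a lighter deformable one.
    # Real versions add 3D bin packing, stability checks, and learning.
    from dataclasses import dataclass, field

    @dataclass
    class Bag:
        weight_kg: float
        deformable: bool

    @dataclass
    class Cell:
        stack: list = field(default_factory=list)   # bottom -> top

    def fits(bag, cell, max_height=3):
        if len(cell.stack) >= max_height:
            return False
        if cell.stack:
            below = cell.stack[-1]
            if below.deformable and not bag.deformable and bag.weight_kg > below.weight_kg:
                return False
        return True

    def place(bag, grid):
        for cell in sorted(grid, key=lambda c: len(c.stack)):
            if fits(bag, cell):
                cell.stack.append(bag)
                return True
        return False    # no valid spot: replan or escalate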
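
And the warm-starting idea for collision-free planning, in the same hedged spirit: seed a local trajectory optimizer with a solution retrieved from similar past problems. The optimizer and collision checker are stand-ins, and a learned generative proposal model could replace the retrieval step with the same interface:

    # Hypothetical warm-start loop for trajectory optimization.
    import numpy as np

    def straight_line(start, goal, n=20):
        return np.linspace(start, goal, n)            # fallback initial guess

    def retrieve_seed(start, goal, library):
        # library: list of (start, goal, trajectory) from previously solved problems
        if not library:
            return straight_line(start, goal)
        def similarity(entry):
            s, g, _ = entry
            return np.linalg.norm(s - start) + np.linalg.norm(g - goal)
        return min(library, key=similarity)[2]

    def plan(start, goal, library, optimize, is_collision_free):
        # optimize() is any local trajectory optimizer (sampling- or gradient-based).
        seed = retrieve_seed(start, goal, library)
        traj = optimize(seed, start, goal)
        return traj if is_collision_free(traj) else None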
These problems can be deeply interlinked: where you grasp an object depends on what your tooling looks like and informs where you can put it, so a perfect solution would reason about all of them jointly. We’re looking forward to getting there as we collect more data and continue our efforts.
Check out our demo videos above! We have a brand-new hardware stack coming soon (and we’ve added a new end effector that we’re keeping hush-hush for now), but it’s amazing what you can do with pure suction.
We’re proud of our progress so far but would love to hear your thoughts and feedback. Let us know if you’ve had a particularly bad baggage horror story and/or have personal experience with the industry.