How robots are helping create drought-resistant crops

A robotics project in Missouri, USA, is looking to tackle world hunger by using autonomous vehicles to collect data that will aid the development of drought-resistant crops.

For every story on how robots threaten the future of humanity, there’s scope for another that shows how they will save us. The fact is, like any technology, the humanitarian benefits and value of robotics to future generations will stem from our choices around their application and the regulations that govern them.

For now though, we can take heart in research at the Vision-Guided and Intelligent Robotics (ViGIR) Laboratory, undertaken in partnership with the University of Missouri in the US and other scientific bodies.

Population increases, climate change, the loss of arable land, pests and disease all play their part in undermining the world’s food supply. The internet of things (IoT) is rising to the occasion and striving to meet global demand, but the effect of drought on many regions of the world remains potentially devastating.

Read more: Real-time disease monitoring unearths power of IoT in agriculture

Laying the groundwork for drought-resistant crops

What started as a collaboration with the College of Agriculture, Food and Natural Resources, creating 3D images of root growth in the laboratory, has flourished into the development of robotics that is helping to create similar images of corn shoots out in the field.

This new robotic architecture for plant phenotyping (the measurement of an organism’s observable physical and biochemical characteristics) consists of two platforms – an autonomous ground vehicle, known as Vinobot, and a mobile observation tower, or Vinoculer.

As the ground vehicle navigates crop rows, collecting data from individual plants, the tower oversees a 60ft radius of the surrounding field, identifying specific plants for the Vinobot to inspect.

The Vinobot, meanwhile, has multiple sensors and a robotic arm to collect temperature, humidity and light-intensity readings at three different heights on the corn plant. This allows it to assess growth, development, yield and other aspects, such as tolerance and resistance to environmental stressors, by correlating these readings with the physiology of the corn shoots.
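
To make the data-collection routine concrete, here is a minimal Python sketch of a Vinobot-style sensor sweep. The function and field names are illustrative assumptions rather than the ViGIR team’s actual software; it simply captures the idea of sampling temperature, humidity and light at three heights per plant.

```python
# Hypothetical Vinobot-style sensor sweep; names, units and the sensor
# interface are illustrative assumptions, not the ViGIR team's actual code.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Reading:
    height_m: float        # arm height above the ground
    temperature_c: float
    humidity_pct: float
    light_lux: float

def sweep_plant(plant_id: str,
                sensor_read: Callable[[float], Tuple[float, float, float]],
                heights_m: Tuple[float, ...] = (0.3, 0.9, 1.5)) -> Dict:
    """Sample temperature, humidity and light intensity at three heights.

    `sensor_read(height)` is assumed to position the robotic arm and return
    a (temperature, humidity, light) tuple for that height.
    """
    readings: List[Reading] = []
    for h in heights_m:
        t, rh, lux = sensor_read(h)
        readings.append(Reading(h, t, rh, lux))
    return {"plant_id": plant_id, "readings": readings}

# Example with a stubbed sensor in place of real hardware:
print(sweep_plant("plot-17/plant-03", lambda h: (24.0, 55.0, 30000.0)))
```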

Read more: Italian start-up Evja launches smart agriculture platform for salad growers

The importance of autonomous crop phenotyping

The neat teamwork between Vinobot and Vinoculer has a threefold advantage. Firstly, the system can assess large areas of a field at any time, night or day, while identifying biotic or abiotic stresses in individual regions.

Secondly, this can be focussed to allow high-throughput plant phenotyping, with either selective or comprehensive data acquisition – from groups or individual plants. And finally, the method does away with the need for the expensive aerial vehicles or confined field platforms that are commonly used today. The researchers’ report claims that the proposed system is cost-effective, reliable, versatile and extendable.

Most significantly, the use of 3D models supplied by the robots expands the traditional measurements of leaf angles, leaf areas and the number of leaves, enabling the potential discovery of new traits. This could give scientists the data needed to develop new genotypes of drought-resistant crops.
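
As a rough illustration of how a 3D model turns into a measurable trait, the snippet below computes a leaf’s insertion angle from stem and leaf direction vectors using the dot product. The geometry extraction itself is assumed to have happened upstream, and the example vectors are made up.

```python
# Illustrative only: a leaf-angle calculation from 3D direction vectors,
# assuming the robots' 3D models yield per-leaf geometry upstream.
import math

def angle_deg(u, v):
    """Angle between stem and leaf direction vectors (3D), in degrees."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Example: a near-vertical stem and an upward-slanting leaf
stem = (0.0, 0.0, 1.0)
leaf = (0.7, 0.0, 0.7)
print(round(angle_deg(stem, leaf), 1))  # ~45.0 degrees
```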

Read more: Tarzan robot swings above crops for automated agriculture

LG set to unveil new line-up of commercial robots at CES

South Korean tech giant LG has announced it will unveil a trio of new robots at the 2018 Consumer Electronics Show in Las Vegas next week. 

Last year, LG launched several robots, and in 2018, it intends to expand its line-up with three new ‘concept robot’ models specifically developed for use in hotels, airports and supermarkets.

These new robots are the Serving Robot, Porter Robot and Shopping Cart Robot, and all fall under LG’s CLOi brand (pronounced ‘Cloh-ee’). They join LG’s Airport Guide Robot and Airport Cleaning Robot, which have been successfully trialled at Incheon International Airport, serving the South Korean capital, Seoul.

LG claims it has also had success with the Lawn Mowing Robot and the affable Hub Robot, which have likewise taken part in trials at an unnamed Korean financial organisation, providing information and services to customers.

Read more: Harrison Manufacturing deploys Sawyer robot to increase throughput

High-tech helpers

The Serving Robot is one of LG’s most practical robots, according to the company, delivering meals and drinks to guests and customers in hotels and airport lounges “quickly and efficiently”.

The Porter Robot also aims to boost efficiency, in this case by delivering luggage straight to guests’ hotel rooms and thus minimising “the inconvenience that may result from slow service and long wait times during a hotel stay.” It also offers express check-in and check-out services and can take customer payments.

Along with airports and hotels, LG is also looking to improve the customer experience at supermarkets with the Shopping Cart Robot. This will enable customers to scan items and view prices on the robot’s display. It will also help them locate items. 

The CLOi robots, LG said, will be “developed in parallel to ThinQ products, LG’s AI brand for consumer electronics and home appliances”. This suggests that the company has plans for these robots to ‘learn’ more tasks over time. 

Read more: San Francisco curbs sidewalk-hogging delivery robots

Cornell engineers program tiny robots to react like insects

Researchers at Cornell University, New York, are developing tiny, insect-inspired robots that don’t just look like the real thing. They think like it too. 

Taking inspiration from nature and mimicking it to the extreme are two very different challenges. Engineers designing intricate robots often seek to replicate the way that animals move, from Boston Dynamics’ legged machines to EPFL’s electronic, pollution-detecting eel.

A more complicated hurdle is copying the way that animals and insects think and process information. Overcoming that could eventually reduce payloads, free space for more computation and make tiny, insect-like robots a lot more convincing.

Researchers at Cornell University are doing exactly that with RoboBees, 80-milligram robotic insects manufactured by the Harvard Microrobotics Lab. With a wingspan of just 3cm, they offer the ideal base unit for new programming that could help them react and adapt to the world like the creatures they were inspired by.

Read more: Origami-inspired artificial muscles give robots superhuman strength

Neuromorphic computer chips offer ‘event-based’ processing

These new developments are enabled by neuromorphic computer chips, which process spikes of electrical current rather than binary code made up of 0s and 1s. These complex electrical combinations work in a similar fashion to how neurons fire inside a brain.

Silvia Ferrari, a professor of mechanical and aerospace engineering and director of Cornell’s Laboratory for Intelligent Systems and Controls, has suggested that neuromorphic computer chips could lessen the need for the dense computers that usually form a robot’s payload.

The Cornell lab is developing ‘event-based’ sensing and control algorithms that mimic neural activity in response to external stimuli. They are being tested with RoboBees.
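
For readers unfamiliar with spike-based processing, a leaky integrate-and-fire neuron is the standard toy example: the membrane potential integrates its input, leaks over time and emits a discrete spike event when it crosses a threshold. The Python sketch below is purely illustrative, with arbitrary constants, and is not the Cornell or Harvard controller.

```python
# A minimal leaky integrate-and-fire neuron, the textbook example of
# spike-based ("event-based") processing. Illustrative only; constants
# are arbitrary and this is not the Cornell/Harvard control code.
def lif_spikes(input_current, dt=1e-3, tau=0.02, v_reset=0.0, v_thresh=1.0):
    """Return spike times (seconds) for a sequence of input-current samples."""
    v, spikes = v_reset, []
    for i, current in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input.
        v += dt * (-(v - v_reset) / tau + current)
        if v >= v_thresh:          # threshold crossing -> emit a spike event
            spikes.append(i * dt)
            v = v_reset            # reset after the spike
    return spikes

# A brief gust-like burst of input produces a burst of spikes.
stimulus = [0.0] * 50 + [80.0] * 30 + [0.0] * 50
print(lif_spikes(stimulus))
```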

“Getting hit by a wind gust or a swinging door would cause these small robots to lose control. We’re developing sensors and algorithms to allow RoboBee to avoid the crash, or if crashing, survive and still fly,” said Ferrari.

“You can’t really rely on prior modeling of the robot to do this, so we want to develop learning controllers that can adapt to any situation.”

Read more: Tiny robot biohybrids could help treat cancer

Removing weight from the equation

As part of the project, the RoboBees have been outfitted with vision, optical flow and motion sensors. The ambition is that “event-based” sensing will soon remove the need for a tethered power source. Cornell’s algorithms could allow RoboBee and similar small robots to become more autonomous and adaptable without being weighed down by bulky power sources.

“We’re using RoboBee as a benchmark robot because it’s so challenging, but we think other robots that are already untethered would greatly benefit from this development because they have the same issues in terms of power,” said Ferrari.

San Francisco curbs sidewalk-hogging delivery robots

San Francisco has passed regulations limiting the number of delivery robots permitted on its sidewalks, as activists push back against the rise of delivery automation startups.

When a world-renowned centre of innovation such as San Francisco faces opposition to disruption, it’s a sharp reminder of the need for regulations to keep pace with technological advancements. Washington DC, Virginia and Idaho already have laws permitting delivery robots to operate, but San Francisco has been slower to legislate for the technology, despite its proximity to Silicon Valley.

We’ve seen this sort of regulatory scrambling in response to the emergence of Uber and Airbnb. Delivery robots are the latest targets to face the scrutiny of the law. Companies such as Marble and Starship are springing up to offer “robots as a service”, with business models that enable food retailers to contract out delivery to automated alternatives.

Read more: Hermes and Starship Technologies to test delivery robots in London

What regulation means for delivery robots

San Francisco supervisor Norman Yee, who authored the legislation, originally pursued a ban on such robots, arguing that the city’s streets “are for people, not robots”. However, this absolute approach was relaxed in October to look at regulation instead.

“Not every innovation is all that great for society,” said Yee, addressing the city’s Board of Supervisors. “If we don’t value our society, if we don’t value getting the chance to go to the store without being run over by a robot… What is happening?”

“When it comes to being proactive about the development of common-sense regulations for commuter shuttles or the sharing economy, such as Airbnb or Uber, somehow we have sent the signal that it is acceptable to act now and ask for forgiveness later,” Yee said at a public hearing on the legislation in October. “That is not an example of a city that leads.”

Under the legislation, each company will be limited to three robot permits, and only nine permits will be available at any one time across the entire city. The robots will be restricted to certain industrial areas and to sidewalks at least six feet wide, and they must be chaperoned at all times. On top of this, a maximum speed of three miles per hour will be imposed. Together, these rules will severely restrict the ambitions of the likes of Marble and Starship.
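
Expressed as code, the reported limits amount to a handful of hard constraints. The sketch below encodes them as a simple permit check; the numbers come from the article, while the function and parameter names are hypothetical.

```python
# Illustrative only: the reported San Francisco limits expressed as a simple
# operational check. Rule values come from the article; the data model is made up.
CITYWIDE_PERMIT_CAP = 9
PER_COMPANY_PERMIT_CAP = 3
MAX_SPEED_MPH = 3.0
MIN_SIDEWALK_WIDTH_FT = 6.0

def operation_allowed(company_permits: int, citywide_permits: int,
                      speed_mph: float, sidewalk_width_ft: float,
                      in_industrial_area: bool, chaperoned: bool) -> bool:
    """Return True only if a robot run satisfies every reported restriction."""
    return (company_permits <= PER_COMPANY_PERMIT_CAP
            and citywide_permits <= CITYWIDE_PERMIT_CAP
            and speed_mph <= MAX_SPEED_MPH
            and sidewalk_width_ft >= MIN_SIDEWALK_WIDTH_FT
            and in_industrial_area
            and chaperoned)

print(operation_allowed(2, 8, 2.5, 7.0, True, True))   # True
print(operation_allowed(2, 8, 4.0, 7.0, True, True))   # False: too fast
```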

The San Francisco Chamber of Commerce lobbied against an all-out ban of such robots, saying it “could create a massive barrier to future innovation in the industry”. Other proponents argued that robots could play an important part in delivering medicines to elderly members of the community who are unable to get to a pharmacy.

Read more: Ford, Domino’s Pizza to trial self-driving delivery service in US

Guiding innovation and disruption

Nonetheless, the overcrowding of sidewalks is undeniably an issue. They simply weren’t designed to accommodate robots too, and activists stressed the risk delivery robots pose to the elderly and disabled.

“Sidewalks, I believe, are not playgrounds for the new remote-controlled toys of the clever to make money and eliminate jobs,” said Lorraine Petty, an activist with the community group Senior and Disability Action, the Guardian reports. “They’re for us to walk.”

In the long term, some cities may start building designated lanes for robots. For many though, the issue goes beyond the monopolizing of sidewalks. “Are robots necessary?” asked San Francisco resident Lori Liederman. “Maybe it isn’t just our safety that’s in jeopardy. Maybe it’s our humanity as well.”

Philosophy aside, like most automation, delivery robots have the potential to both harm and benefit society. It comes down to scale, what they’re delivering and why. Often, we’d benefit from fetching food ourselves or using a human delivery service, but in other cases robot deliveries could augment existing services or provide an affordable lifeline to isolated individuals.

This scope for benefit and harm is common to disruption and the hopes, fears and unknowns that come with it. In the face of such uncertainty, there is a very real need for regulation that protects the wider interests of society while leaving space to innovate and grow. It’s a difficult balance, one that encompasses technical, social, legal and economic considerations.

Read more: JD.com launches robot delivery in China

Ocado’s robots offer ‘safe pair of hands’ for packing shopping

Online retailer Ocado has developed a robotic system capable of packing bags with all the care and dexterity of a human. 

UK-based online grocery retailer Ocado has for some years been at the forefront of artificial intelligence (AI) and robotics from a retail perspective, thanks to the work of its Ocado Technology unit.

As well as developing in-house robotic solutions for its customer fulfilment centers, the team is participating in two EU-funded Horizon 2020 projects, with a view to devising smarter warehouse robots: SoMa and SecondHands.

This week, Ocado has unveiled a new robotics solution that uses suction, computer vision and sensors to pack shopping items safely into bags.

Read more: Online-only retailer Ocado trials robotic arm to speed up orders

Demand for smarter warehouse robots

Picking up and packing groceries is a routine task for humans, but it’s full of complicated decisions not easily replicated by an industrial robot. In a fast-moving fulfilment center handling an average of 260,000 orders per week, any automated system has to be able to think on its feet – so to speak.

Ocado deals with a huge range of products. In a high-speed game of retail Tetris, a successful packing robot has to take into account each item’s weight, shape and orientation when placing it among other objects.

The retailer’s latest solution is “conceptually simple”; its main visible feature is a suction cup on the end of an articulated arm. But beneath the surface, there’s much more going on. The algorithm controlling the robot must have a grasp of more than just the items in front of it. For starters, it needs to understand the difference between the green crates that come from the storage department and the red ones that are designated for customer delivery.

It also needs to identify and locate the “optimal grasp point” of every item within the crate. At the same time, it has to scan the delivery crate for free space and place multiple products with care, and often the right way up, too.
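
A toy placement check gives a flavour of the constraints involved. The sketch below is not Ocado’s code: it simply tests whether an item’s footprint fits a free slot in the delivery crate and refuses to stack a heavier item on a lighter one, whereas a real system weighs many more factors.

```python
# Hypothetical placement check: footprint fit plus a simple "don't crush
# lighter items" rule. Illustrative only; not Ocado's packing algorithm.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Item:
    name: str
    weight_kg: float
    footprint_cm: Tuple[float, float]   # (width, depth) in preferred orientation

@dataclass
class Slot:
    free_cm: Tuple[float, float]        # (width, depth) of free space in the crate
    item_below: Optional[Item] = None   # item already occupying the space beneath

def fits(item: Item, slot: Slot, allow_rotation: bool = True) -> bool:
    w, d = item.footprint_cm
    fw, fd = slot.free_cm
    footprint_ok = (w <= fw and d <= fd) or (allow_rotation and d <= fw and w <= fd)
    # Never place a heavier item on top of a lighter one.
    stacking_ok = slot.item_below is None or slot.item_below.weight_kg >= item.weight_kg
    return footprint_ok and stacking_ok

eggs = Item("eggs", 0.4, (15, 10))
cans = Item("canned soup", 1.2, (10, 10))
print(fits(eggs, Slot(free_cm=(20, 20), item_below=cans)))  # True
print(fits(cans, Slot(free_cm=(20, 20), item_below=eggs)))  # False: cans would crush the eggs
```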

Read more: Ocado launches Alexa app for voice-activated online shopping

How 3D vision systems bridge the gap

Instead of taking on the monumental task of modelling every single item in the product catalogue, Ocado Technology developed a 3D vision system capable of identifying the best grasp points on each item. The arm then lowers the suction cup to the selected point and uses a vacuum to pick the item up carefully and move it to the delivery crate.

While that is happening, the system confirms that the product is the correct one. It then works out the best orientation before placing it into the bag.

To make sure products aren’t damaged during the process, sensors built into the robotic arm gauge the weight of the item being handled.
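
Putting those steps together, the sketch below outlines the pick-confirm-place sequence as described. The method names (find_grasp_point, confirm_identity and so on) are assumptions standing in for Ocado’s unpublished internals, with the vision, arm and crate objects supplied by the caller.

```python
# Illustrative pick-and-pack sequence; the vision, arm and crate objects and
# their method names are hypothetical stand-ins for Ocado's internal systems.
def pick_and_pack(item, vision, arm, crate) -> bool:
    grasp = vision.find_grasp_point(item)        # 3D vision selects the best grasp point
    arm.move_to(grasp)
    arm.vacuum_on()                              # suction cup lifts the item
    if not vision.confirm_identity(item):        # confirm the correct product was picked
        arm.vacuum_off()
        return False
    arm.limit_force_for(arm.measured_weight())   # arm sensors gauge the weight so the
                                                 # item is handled gently
    pose = crate.best_free_pose(item)            # free space in the crate, right way up
    arm.place_at(pose)
    arm.vacuum_off()                             # release into the delivery bag
    return True
```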

In a blog post, Ocado Technology points out how much time and effort was saved by devising the 3D vision system. “The fact we found a way to bypass modelling our SKUs also meant that we could pick a greater range of items than many industrial picking systems. All in all, the system is streamlined and flexible, and our robotics team are very proud of the progress they have made so far.”

Read more: Ocado trials driverless CargoPod for last-mile grocery deliveries
