Dexterous MIT Robot Picks Up, Puts Down Objects It’s Never Seen Before

Ask a human to hang a mug by its handle on a hook, and they won’t hesitate.

Ask a robot to carry out the same task, and you’ll be waiting a long while.

Unless you’re at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), where researchers have developed a system that automates pick-and-place tasks.

“Whenever you see a robot video on YouTube, you should watch carefully for what the robot is NOT doing,” MIT professor Russ Tedrake, senior author of a paper about the project, said in a statement. “Robots can pick almost anything up, but if it’s an object they haven’t seen before, they can’t actually put it down in any meaningful way.”

The two most common approaches to picking up objects are “pose-based” systems that estimate an object’s position and orientation, and general grasping algorithms.

But these methods, according to MIT, are flawed: Pose estimators often don’t work with objects of significantly different shapes, and grasping approaches can’t place objects with much subtlety.

In contrast, CSAIL’s kPAM (Keypoint Affordance Manipulation) approach detects a collection of coordinates (keypoints) on an object—establishing a sort of visual roadmap.

Keypoints naturally handle variation among a particular type of object, like a mug or shoe; these coordinates provide all necessary information for a robot to determine what to do with an item.

Trial and error (via MIT Computer Science and Artificial Intelligence Laboratory)

“Understanding just a little bit more about the object—the location of a few key points—is enough to enable a wide range of useful manipulation tasks,” Tedrake said.

In the case of a mug, the system requires only three keypoints—the center of the mug’s side, bottom, and handle. For a shoe, kPAM needs six points to pick up more than 20 different pairs of footwear, from slippers to boots.
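
To make the keypoint idea concrete, here is a minimal Python sketch (the coordinates, names, and translation-only math are illustrative assumptions, not CSAIL’s code): the mug is reduced to a handful of labeled 3D points, and the placement goal is expressed as a constraint on one of them.

    import numpy as np

    # Hypothetical keypoints detected on a mug, in meters (made-up values).
    mug_keypoints = {
        "bottom_center": np.array([0.42, 0.10, 0.02]),
        "top_center":    np.array([0.42, 0.10, 0.12]),
        "handle_center": np.array([0.47, 0.10, 0.07]),
    }

    # Placement goal: bring the handle keypoint to the hook's position.
    hook_position = np.array([0.60, -0.20, 0.30])

    def translation_to_goal(keypoints, anchor, goal):
        """Translation that carries the chosen keypoint onto the goal.

        A real planner would solve for a full rigid transform under extra
        constraints (e.g., keep the mug upright); this sketch computes only
        the translation, to show how little information the robot needs.
        """
        return goal - keypoints[anchor]

    offset = translation_to_goal(mug_keypoints, "handle_center", hook_position)
    target_keypoints = {name: p + offset for name, p in mug_keypoints.items()}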

“This particular representation works magically well with today’s state-of-the-art machine learning perception and planning algorithms,” Tedrake added.

Moving forward, the team hopes to teach their system to perform tasks with even greater generalizability—like unloading a dishwasher or wiping kitchen counters.

Adobe’s New AI Tool Can Identify Photoshopped Faces

The Internet cannot be trusted: Between doctored photos and deepfaked videos, there’s just no telling fact from fiction.

In an effort to regulate the digital Wild West it helped usher in 30 years ago, Photoshop maker Adobe developed a new tool for identifying altered images.

Researchers Richard Zhang and Oliver Wang—along with UC Berkeley collaborators Sheng-Yu Wang, Andrew Owens, and Alexei Efros—created a method for detecting edits made using Photoshop’s Face Aware Liquify filter.

The function automatically distinguishes facial features, making it easy to adjust eye size, nose height, smile width, and face shape.

Popular with photographers who didn’t quite capture the expression they wanted, the feature’s delicate effects “made it an intriguing test case for detecting both drastic and subtle alterations to faces,” an Adobe blog post said.

“While we are proud of the impact that Photoshop and Adobe’s other creative tools have made on the world, we also recognize the ethical implications of our technology,” the company wrote.

“Trust in what we see is increasingly important in a world where image editing has become ubiquitous,” it continued. “Fake content is a serious and increasingly pressing issue.”

With that in mind, Adobe partnered with the University of California, Berkeley, as part of a broader effort to better expose image, video, audio, and document manipulations.

Using pictures scraped from the Internet—as well as some modified by a human artist—the team trained a Convolutional Neural Network (CNN) to recognize altered images of faces.

“We started by showing image pairs (an original and an alteration) to people who knew that one of the faces was altered,” Oliver Wang said in a statement. “For this approach to be useful, it should be able to perform significantly better than the human eye at identifying edited faces.”

Spoiler alert: It does.

Flesh-and-blood people were able to ID the altered face 53 percent of the time (slightly better than chance), while the neural network achieved results as high as 99 percent.

The tool also pinpointed specific areas and methods of facial warping, and was able to revert images to what it estimated was their original state. The results, according to Adobe, impressed “even the researchers.”

Adobe’s new tool was nearly twice as good at identifying manipulated images as humans (via Adobe/UC Berkeley)
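
For a rough sense of what training a CNN for this kind of task involves, here is a heavily simplified, hypothetical PyTorch sketch of a two-class (original vs. manipulated) face classifier; it is not Adobe’s model, and the tiny network and learning rate are placeholders.

    import torch
    import torch.nn as nn

    # Toy two-class CNN: label 0 = original face, label 1 = manipulated face.
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, 2),
    )
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    def training_step(images, labels):
        """One optimization step on a batch of face crops and 0/1 labels."""
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        return loss.item()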

“It might sound impossible because there are so many variations of facial geometry possible,” UC Berkeley professor Efros said. “But, in this case, because deep learning can look at a combination of low-level image data, such as warping artifacts, as well as higher level cues such as layout, it seems to work.”

This isn’t the end of fake news just yet, though.

“The idea of a magic universal ‘undo’ button to revert image edits is still far from reality,” Zhang admitted, bursting our collective bubble. “But we live in a world where it’s becoming harder to trust the digital information we consume, and I look forward to further exploring this area of research.”

“This is an important step in being able to detect certain types of image editing, and the undo capability works surprisingly well,” Gavin Miller, head of Adobe Research, added.

“Beyond technologies like this,” he said, “the best defense will be a sophisticated public who know that content can be manipulated—often to delight them, but sometimes to mislead them.”

A Smart Speaker Could Save You From Cardiac Arrest

My fiancé is a deep sleeper: If I stopped breathing or began gasping for air in the middle of the night, he’d snore right through it.

But a smart speaker could save my life.

Researchers at the University of Washington developed a tool to monitor people for cardiac arrest when they’re asleep.

Nearly 500,000 Americans die each year from cardiac arrest—many in the comfort of their own bedroom.

A new skill for a smart speaker or phone, however, could detect the gasping sound of abnormal breathing and call for help.

On average, the proof-of-concept tool—trained on real agonal breathing instances captured from 911 calls—identified labored breathing patterns 97 percent of the time, from up to 20 feet away.

The findings were published this week in a Nature journal.

“A lot of people have smart speakers in their homes, and these devices have amazing capabilities that we can take advantage of,” study co-author Shyam Gollakota, an associate professor at UW, said in a statement.

Researchers envision a contactless system that continuously and passively monitors the bedroom for an agonal breathing event and calls for help (via Sarah McQuate/University of Washington)

Picture this: You’re enjoying a pleasant vision of shirtless Hugh Jackman serenading you on a yacht as it cruises through the canals of Venice. Suddenly, your heart stops pumping and blood flow ceases; you’re fighting for breath, making guttural gasping noises, and involuntarily twitching. Neither Hugh nor anyone else is there to perform first aid.

The smart speaker on the bookshelf, however, recognizes signs of agonal breathing and calls 911.

You’re alive and well to stalk Hugh Jackman live another day.

Researchers gathered audio from 162 real-life 911 calls to Seattle’s Emergency Medical Services, collecting short recordings on different devices, including an Amazon smart speaker, iPhone 5s, and Samsung Galaxy S4.

They added interfering sounds like barking dogs, honking cars, and humming A/C units: “Things that you might normally hear in a home,” according to first author and UW doctoral student Justin Chan.

And accounted for distractions like snoring or obstructive sleep apnea.
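
That augmentation step might look something like the hypothetical sketch below (the mixing ratio and variable names are invented): background noise is blended into each breathing clip so the detector learns to ignore ordinary household sounds.

    import numpy as np

    def mix_in_noise(breathing, noise, noise_level=0.3):
        """Blend a background-noise clip into a breathing clip.

        Both inputs are 1-D arrays of audio samples at the same sample rate;
        the noise is tiled or trimmed to match the clip length, then added
        at reduced volume.
        """
        reps = int(np.ceil(len(breathing) / len(noise)))
        noise = np.tile(noise, reps)[: len(breathing)]
        return breathing + noise_level * noise

    # e.g. augmented_clip = mix_in_noise(agonal_clip, barking_dog_clip)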

“We don’t want to alert either emergency services or loved ones unnecessarily, so it’s important that we reduce our false-positive rate,” Chan said.

Moving forward, the team envisions the algorithm functioning like a mobile app or Alexa skill that runs passively and in real time while people sleep, so there’s no need to store data or send it to the cloud.
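
In that setup, the detector would amount to a small loop running entirely on the device, something like the hypothetical sketch below; the classify() callback and the "two positives out of three windows" rule stand in for the team’s actual model and thresholds.

    import collections
    import time

    WINDOW_SECONDS = 2.5                      # audio analyzed per step (assumed)
    recent = collections.deque(maxlen=3)      # results of the last few windows

    def monitor(record_window, classify, call_for_help):
        """Continuously classify short audio windows; alert only on repeats.

        record_window() returns the latest audio samples, classify() returns
        True if a window sounds like agonal breathing, and call_for_help()
        raises the alarm. Requiring several positive windows in a row is one
        simple way to keep the false-positive rate down.
        """
        while True:
            audio = record_window(WINDOW_SECONDS)
            recent.append(classify(audio))
            if sum(recent) >= 2:              # at least 2 of the last 3 windows
                call_for_help()
                recent.clear()
            time.sleep(WINDOW_SECONDS)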

“Cardiac arrests are a very common way for people to die, and right now many of them can go unwitnessed,” co-author Jacob Sunshine, an assistant professor at the UW School of Medicine, said. “Part of what makes this technology so compelling is that it could help us catch more patients in time for them to be treated.”

Netflix Hacks Highlight What Could Be

Netflix’s internal hackathons have gained a reputation for producing some genuinely great ideas—from stopping playback when a Fitbit detects you’ve fallen asleep, to a floating desktop window that lets you stream content while you work.

And Hack Day 2019 did not disappoint.

Last month’s event focused on “studio efforts” (whatever that means).

“The goal remained the same: team up with new colleagues and have fun while learning, creating, and experimenting,” according to a company blog post. “We know even the silliest idea can spur something more.”

Project Rumble Pak

Picture this: You’re watching an episode of Stranger Things. Eleven—a psychokinetic teenager—mind-pushes a table across the room. Your phone violently shakes with the action.

No matter how great a special effects team is, sight and sound alone can only take an audience so far. Which is why Hans van de Bruggen and Ed Barker developed Project Rumble Pak.

“The … Hack Day project explores how haptics can enhance the content you’re watching,” Netflix explained. “With every explosion, sword clank, and laser blast, you get force feedback to amp up the excitement.”

Barker and van de Bruggen focused on animated series Voltron, synchronizing video content with haptic effects using Immersion Corporation technology.
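
Conceptually, that synchronization boils down to a timeline of haptic cues keyed to playback time. The sketch below is a bare-bones, hypothetical illustration (the cue list and intensities are made up, and a real implementation would call the phone’s haptics API rather than a passed-in function).

    # (timestamp in seconds, vibration intensity from 0.0 to 1.0) for one scene
    haptic_cues = [(12.0, 0.8), (13.5, 0.4), (21.2, 1.0)]

    def fire_due_cues(playback_time, cues, vibrate, tolerance=0.05):
        """Trigger every cue whose timestamp matches the current playback time."""
        for t, intensity in cues:
            if abs(playback_time - t) <= tolerance:
                vibrate(intensity)

    # Called once per player tick, e.g.:
    # fire_due_cues(current_position_seconds, haptic_cues, phone_vibrate)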

The Voice of Netflix

If Netflix could speak, what would it sound like?

A weird amalgamation of characters and vocalizations, apparently.

Carenina Garcia Motion and Guy Cirino teamed up to create “The Voice of Netflix”: a neural network trained to spot words in streaming content and reassemble them into new sentences—on demand.

Type any sentence into the online portal, and listen for yourself.

The current vocabulary is limited to 2,340 words (with more to come), meaning the digital voice will simply skip over any unfamiliar terms—including “Netflix”—to create stunted, hilariously awkward sentences.
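
The assembly step is conceptually simple. In the hypothetical Python sketch below (the clip table is invented), each known word maps to a short audio clip, the requested sentence is looked up word by word, unknown words are skipped, and the surviving clips are concatenated.

    import numpy as np

    # Hypothetical map from known words to audio clips (arrays of samples);
    # the real vocabulary is extracted from Netflix shows.
    clip_library = {
        "hello": np.zeros(8000),
        "world": np.zeros(8000),
    }

    def speak(sentence, library):
        """Concatenate clips for known words; silently skip unknown ones."""
        clips = [library[w] for w in sentence.lower().split() if w in library]
        return np.concatenate(clips) if clips else np.array([])

    audio = speak("Hello brave new world", clip_library)  # "brave"/"new" skipped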

Not every Netflix Hack Day concept is directly related to the streaming platform.

This year, a team invented TerraVision—a new way for filmmakers to search for and discover locations. Just drop a photo of a particular look (a house, a park, a castle, a swimming pool) into an interface to find the closest matches.
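
Under the hood, a "find the closest matches to this photo" feature is typically a nearest-neighbor search over image embeddings. The sketch below is a generic, hypothetical illustration of that pattern, not Netflix’s implementation; the embedding model itself is assumed.

    import numpy as np

    def closest_locations(query_embedding, location_embeddings, top_k=5):
        """Rank stored location photos by cosine similarity to the query photo.

        location_embeddings is a (num_photos, dim) array of precomputed
        feature vectors, e.g. from a pretrained image model.
        """
        q = query_embedding / np.linalg.norm(query_embedding)
        db = location_embeddings / np.linalg.norm(
            location_embeddings, axis=1, keepdims=True)
        return np.argsort(-(db @ q))[:top_k]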

Another dreamed up a clever way to clear out office conference rooms.

With the click of a button, Netflix employees the world over can nudge co-workers out of a meeting room after their session has ended.

The web app looks up calendar events associated with a given space and finds the latest session that should have ended, then automatically calls into the room, playing walk-off music (just like at the Oscars).
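
Stripped of the calling and walk-off music, the scheduling logic described above could look something like this hypothetical sketch (the event structure is an assumption):

    from datetime import datetime

    def latest_overdue_event(events, now=None):
        """Return the most recently ended event that should have wrapped up.

        events is a list of dicts like {"title": ..., "end": datetime}; the
        caller would then ring the room and play walk-off music for it.
        """
        now = now or datetime.now()
        overdue = [e for e in events if e["end"] <= now]
        return max(overdue, key=lambda e: e["end"]) if overdue else None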

“The most important value of hack days is that they support a culture of innovation,” the Netflix TechBlog said. “We believe in this work, even if it never ships, and love to share the creativity and thought put into these ideas.”
