Tragically underrated comedy writer Erma Bombeck once wrote that, “Some of the best fiction writers got their start writing airline menus.”
Most people—and certainly most underrated comedy writers—agree that airline food is bad. Luckily, a food delivery app wants to make sure that you never fly hungry again and won’t be tempted to eat whatever it is the airlines are serving. AtYourGate is a food delivery service that will bring you food while you loiter around the boarding gate, even if your seat is back in Row 44. The delivery team doesn’t have to bring your loaded baked potato through security, though, because for better or worse, the app only lets users choose from options within the airport. To use it, just download the app, browse the options, and make your selection, and a runner will bring it over to you, usually within 20 to 30 minutes.
AtYourGate has just announced it is landing in the New York area, making it easier for passengers flying through Newark Liberty International, LaGuardia, and Terminal 7 of John F. Kennedy International to decide whether it’s worth paying someone to bring them an Auntie Anne’s pretzel. (New York and New Jersey airport travelers who download the app qualify for free delivery on their first purchase by entering the promotion code AYGFREE. There is no minimum order requirement.)
The app is still pretty new, with only the New York-area, San Diego, and Minneapolis-St. Paul airports on board, but if it rolls out nationwide, folks traveling through Portland International Airport (hello, Country Cat) or O’Hare (where Rick Bayless has a torta restaurant) may never eat airplane food again.
Facial recognition technology has progressed to a point where it now interprets emotions in facial expressions. This type of analysis is increasingly used in daily life. For example, companies can use facial recognition software to help with hiring decisions. Other programs scan the faces in crowds to identify threats to public safety. Unfortunately, this technology struggles to interpret the emotions of black faces. My new study, published last month, shows that emotional analysis technology assigns more negative emotions to black men’s faces than to white men’s faces. This isn’t the first time that facial recognition programs have been shown to be biased. … This story continues at The Next Web