Hexbyte Tech News Wired: Your Next Game Night Partner? A Computer
This image from Iconary displayed a car, several tools, and a doorway.


When the arrow appeared next to the birdcage, I finally understood what my partner was trying to say.

The game was a clone of Pictionary—I had to guess the phrase based on a drawing. My partner had initially depicted a duck next to a cage, plus a hand, and a pond. Only after I asked for another drawing and the arrow was added did I realize the hand was “releasing” the duck, not feeding it. “You win!!!” I was told, after typing in the full answer.

There was no prize, but it felt good. My partner felt nothing—because it was a bot. Despite our mutually incompatible hardware and wetware, we’d found shared meaning, of a kind, in a tangle of pixels and characters.

The game, named Iconary, is now available for anyone to play online. If you play, you’ll be contributing to a research project trying to make computers better coworkers and collaborators. You don’t have to play long to see that the bot needs the help. How showing me a cat superimposed with a crucifix was supposed to signify “laughing in the yard” is unclear.

Getting computers and humans to play games together isn’t new. It’s been an obsession of some artificial intelligence researchers since the field’s early days in the 1950s. Yet that history is largely one of conquest by machines. Over the decades, computers have progressively beaten human champions at checkers, chess, Go, and in December, the popular strategy videogame StarCraft II.

“You might think AI stands for antagonistic intelligence,” says Oren Etzioni, CEO of the Allen Institute for AI, the independent research lab in Seattle that created Iconary, and the bot that plays it, named AllenAI. The project aims to explore what can be learned when people and software play a game that involves collaboration, not a zero-sum fight for dominance.

Iconary works like Pictionary: A player must get a teammate to guess a word or phrase solely by drawing. The name differs in part because Mattel has a trademark on Pictionary, and also because the “draw-er” builds an image from a series of icons. Human players generate those icons piecemeal, drawing items freehand with a mouse, and then selecting from icons the computer suggests. To depict “fruit salad,” you might draw and then select icons for bananas, lemon, apple, and a knife.

“It’s kind of fun,” says David Forsyth, a professor who researches AI at the University of Illinois at Urbana-Champaign, after receiving a preview of Iconary this week. Like WIRED, he found that AllenAI is in general a better guesser than drawer. But that a bot can play a game involving remixing visual concepts and converting them into language is notable, he says.

Recent advances in machine learning have made computers pretty good at recognizing specific objects, if they’ve been trained to look for them. That’s good for searching your phone’s camera roll for cats, or a crowd for a specific face. Reading—or creating—higher-level meaning from a combination of visual concepts is much more challenging, Forsyth says.

The bot you’ll play Iconary with was created by applying machine learning algorithms to records of more than 100,000 Iconary games played by humans, drawing and guessing around 75,000 different phrases. The researchers drew on a recent leap in the capability of software that extracts meaning from text, brought about in part by research at the Allen Institute for AI.

The phrases used in the online game were not part of the software’s prior training. Etzioni says his team believes gathering new data from the bot’s encounters with humans will help researchers improve the software’s ability to understand images, text, and the ways people use them.

Eventually, they hope to use Iconary to stage a version of the Turing Test, a way to probe a computer’s intelligence proposed by British mathematician Alan Turing in 1950. In Turing’s version, a person converses with an unseen party via text and must guess whether or not that interlocutor is human. Etzioni’s version would see people play Iconary with an unknown teammate, and then guess if it was a person, or the AllenAI bot.

That achievement would be a nice moment of cross-species collaboration to place alongside earlier AI milestones, in which successive human champions were roundly defeated by software at games they love. Algorithms that could do that might also have potential in more practical situations.

Etzioni believes that algorithms able to work with humans to remix verbal and graphical concepts could help people compose business documents or collaborate on creative projects. Forsyth says that software able to understand novel combinations of imagery could help computers venture out into the messiness of the real world. For example, complex domestic robots will need to extract meaning from endless novel combinations of household items in order to function reliably in homes, he says.



Hexbyte Hacker News Computers: Google’s Night Sight for Pixel phones will amaze you

Google’s Pixel phones have already changed and improved smartphone photography dramatically, but the latest addition to them might be the biggest leap forward yet. Night Sight is the next evolution of Google’s computational photography, combining machine learning, clever algorithms, and up to four seconds of exposure to generate shockingly good low-light images. I’ve tried it ahead of its upcoming release, courtesy of a camera app tweak released by XDA Developers user cstark27, and the results are nothing short of amazing. Even in this pre-release state, before Google is happy enough to ship it officially, the new night mode makes any Pixel phone that uses it the best low-light smartphone camera around.
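Google hasn’t published the exact pipeline behind Night Sight, but one well-known principle in multi-frame low-light photography is that merging several short exposures cancels out random sensor noise. The sketch below (a toy illustration with NumPy, using a flat synthetic scene rather than any real camera data, and a naive average in place of Google’s actual alignment and merging) shows the statistical effect: averaging N frames cuts noise by roughly the square root of N.

```python
# Toy illustration of multi-frame noise reduction, NOT Google's pipeline:
# averaging several noisy exposures of the same scene suppresses random
# sensor noise by roughly sqrt(number of frames).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dim scene: a flat gray patch, plus 16 noisy "captures" of it.
true_scene = np.full((64, 64), 20.0)
n_frames = 16
frames = [true_scene + rng.normal(0.0, 10.0, true_scene.shape)
          for _ in range(n_frames)]

single = frames[0]                 # one noisy short exposure
merged = np.mean(frames, axis=0)   # naive merge (no motion, so no alignment)

# Residual noise (std dev of the error vs. ground truth) in each case.
err_single = np.std(single - true_scene)
err_merged = np.std(merged - true_scene)
print(err_single / err_merged)     # roughly sqrt(16) = 4
```

In a real phone the frames also have to be aligned to compensate for hand shake and moving subjects before merging, which is where the "clever algorithms" come in; the averaging itself is the easy part.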

Let’s take a look at a few examples, shall we? All of the shots below are taken with the Pixel 3 XL: first with the default settings and second with the night mode toggled on. Google claims Night Sight will save you from ever having to use the flash again, and so naturally, I didn’t use it with any of these images.

If you listen closely, you might be able to hear every other phone camera engineer flipping their desk and resigning in disgust. This is just an astonishing improvement. The Pixel 3’s camera is already among the very best low-light performers, so when a scene is so dark that it barely registers anything, you know there’s hardly any light. And yet, with Night Sight on, we actually see a scene that looks like a moderately noisy daytime shot.

Though less dramatic than the difference between the first pair of images, this comparison still represents a step change from a crummy, noisy, unusable image to a perfectly decent shot.

This is easily my favorite comparison because the differences are so obvious that they scarcely need analysis. The default Pixel shot actually does an admirable job — most other phones would smudge the text to smithereens in such challenging conditions — but the night mode completely overhauls the photo. Google says that its machine learning detects what objects are in the frame, and the camera is smart enough to know what color they are supposed to be. That’s part of what makes these reds pop so beautifully.

Say hello to Vlad circa 1990. You’ll notice that all of the photos in this set are indoors, and that’s because, well, I’ve just gotten this camera APK running today, and it’s not night yet. We’ll get some outdoor night shots into this article as soon as we can, and then we’ll follow up with a fuller review of Night Sight when Google officially releases it.

Of course, Night Sight isn’t a perfect solution to every low-light situation. If you have small but focused sources of bright light, this night mode will blow out some highlights. Even so, an actual long exposure would probably blow out the entire screen on the Pixel 3, whereas the night mode shot keeps it readable. The fact that I can casually do this at my desk, without any tripod or specialist equipment (like, say, a dedicated camera), is bewildering.

The first shot is how I usually appear on Verge team conference calls. The second is just Google’s inexplicable camera magic.

I love this old painting that’s hanging above my computer. It looks terrible with the regular Pixel camera, but the night mode version restores it beautifully. All the cracks in the surface of the wood show up, replacing the nasty blotchiness of the other photo, and even the wispy curls of hair on the girl’s forehead are discernible. If I were to take the same shot with the lights turned on, I’d get an undesirable highlight on the painting.

I have to keep saying it because it’s important: the dark photos in these comparisons are probably the best you can get from a smartphone camera today. And yet the Pixel’s night mode makes the default Pixel’s pictures look like they were taken with a phone from a decade ago. The color fidelity and sharpness of these Night Sight images are not things I can understand or explain. There’s simply no reference point for this kind of imaging improvement through software.

Here’s a final comparison to drive home the point. The first photo is a sea of chromatic noise, smudged up by noise reduction blur working overtime to produce a somewhat reasonable image. All you’d ever get from that is an outline of the headphones. With night mode turned on, you can read the “Beyerdynamic” on the ear cup, you can see the “1” cutout next to the “T” in the yoke, and you can even read the last three digits of the serial number on the inside on the left of the headband.

All of this amazingness is coming from pre-release software. You can download it for your Pixel device and play around with it as I have and lose your shit at the unreal results it produces. Google is evidently close to releasing the final Night Sight addition to the Pixel camera app, and when it does, it’ll probably change the mobile photography game.

Photography by Vlad Savov / The Verge
