The House Science Committee May Soon Become… Pro-Science

Rep. Bill Foster, an Illinois Democrat and particle physicist, hopes that the House will finally start taking environmental science seriously.

Bill Clark/CQ Roll Call/Getty Images

For the past eight years, climate science has been under a sort of spell in the House of Representatives. Instead of trying to understand it better or even acknowledging some of the field’s current uncertainties, House Science Committee Chairman Rep. Lamar Smith (R-Texas) used his position to harass federal climate scientists with subpoenas while holding hearings on “Making the EPA Great Again” or whether “global warming theories are alarmist” and researchers are pursuing a “personal agenda.”

But Smith retired this year and Democrats won control of the House on Tuesday. Now some on Capitol Hill say that the anti-climate science spell may be broken.

Midterm Elections 2018: Voters Chipped Away at Gerrymandering

Both Republicans and Democrats woke up Wednesday morning claiming victory in Tuesday’s midterms. Democrats patted themselves on the back for taking back the House of Representatives and for flipping seven governorships from red to blue. And in a press conference, President Donald Trump praised his party, and himself, for gaining ground in the Senate.

Americans remain sharply divided at the ballot box, from which political party they support to initiatives on issues like climate change. But they consistently voted against one thing on Tuesday: gerrymandering. In Michigan, Missouri, and Colorado, voters overwhelmingly passed ballot initiatives to put an end to this practice; as of Wednesday evening, another in Utah held onto a slight lead. They join another initiative passed in Ohio earlier this year.

San Francisco Sets a New Tech Trend: Tax the Companies

Salesforce CEO Marc Benioff was a major proponent, and big financial backer, of Proposition C.

Aflo Co. Ltd./Alamy

There is perhaps no greater example of Silicon Valley’s soft power than watching a debate around a grassroots proposal to fight homelessness transform into a Twitter war between tech billionaires over their preferred form of taxation. Rob Reich, author of the new book Just Giving: Why Philanthropy Is Failing Democracy and How It Can Do Better, called the drama around San Francisco’s Proposition C just as much a sign of our absurd times as New York governor Andrew Cuomo’s offer to rename himself “Amazon Cuomo” if it would inspire Jeff Bezos to put Amazon’s second headquarters in New York.

Even in the midst of a tech backlash, “the deference to economic power is still strong” among elected officials, says Reich, a Stanford political science professor. But on Tuesday, the voters most familiar with Big Tech’s long-term impact refused to capitulate. Nearly 60 percent of San Francisco voters supported Prop C, which is projected to double the city’s budget for homeless services, raising an additional $300 million a year through a gross receipts tax on roughly 400 companies, including Square and Salesforce, whose billionaire CEOs bickered online about the initiative.

Photos: New Yorkers Wait ‘on Line’ to Vote in Midterms

Boeing Issues Safety Warning After a Fatal 737 MAX Nosedive

U.S. investigators examine recovered parts of the Lion Air jet that crashed into the sea at Tanjung Priok Port in Jakarta, Indonesia. Divers have recovered the flight data recorder from the Boeing 737 MAX 8 plane.

Tatan Syuflana/AP

Investigators are still working to discover exactly what went wrong with a Lion Air flight in Indonesia on Monday, October 29, when a Boeing 737 MAX plunged into the Java Sea, killing all 189 people onboard.

But the initial findings have highlighted a possible sensor problem, and that has been enough for Boeing to issue safety warnings to all the airlines that operate those planes, telling pilots to brush up on how to deal with confusing readings or erratic actions from the flight control computer, which could cause the planes to dive, hard. And the FAA now says it’s throwing its weight behind Boeing’s advisory, making compliance mandatory for US airlines.

How to Hack an Election (Without Touching the Machines)

Stephen Maturen/Getty Images

On Monday morning, just 24 hours before polls opened in the US midterm elections, President Trump sounded an alarm with a Tweet: “Law Enforcement has been strongly notified to watch closely for any ILLEGAL VOTING which may take place in Tuesday’s Election (or Early Voting). Anyone caught will be subject to the Maximum Criminal Penalty allowed by law. Thank you!”

The rumor was part of a pair; over the weekend, Trump tweeted that Indiana senator Joe Donnelly was “trying to steal the election” by buying Facebook ads for the libertarian Senate candidate.

Midterm Elections 2018: Net Neutrality Faces New Uncertainty

Senator-elect Marsha Blackburn of Tennessee is a critic of Obama-era net neutrality rules.

Alex Wong/Getty Images

Tuesday’s midterms don’t shed much light on the future of net neutrality. But advocates do see rays of hope shining through the fog of uncertainty.

Democrats, who generally favor rules barring internet service providers like Comcast and Verizon from blocking or otherwise discriminating against content, took control of the House. And even after losing ground in the Senate, the party is tantalizingly close to having enough support from Senate Republicans to pass new net neutrality protections.

Learning Concepts with Energy Functions

We’ve developed an energy-based model that can quickly learn to identify and generate instances of concepts, such as near, above, between, closest, and furthest, expressed as sets of 2d points. Our model learns these concepts after only five demonstrations. We also show cross-domain transfer: we use concepts learned in a 2d particle environment to solve tasks on a 3-dimensional physics-based robot.

Many hallmarks of human intelligence, such as generalizing from limited experience, abstract reasoning and planning, analogical reasoning, creative problem solving, and capacity for language, require the ability to consolidate experience into concepts, which act as basic building blocks of understanding and reasoning. Our technique enables agents to learn and extract concepts from tasks, then use these concepts to solve other tasks in various domains. For example, our model can use concepts learned in a two-dimensional particle environment to carry out the same task in a three-dimensional, physics-based robotic environment, without retraining in the new environment.

A simulated robot trained via an energy-based model navigates its arm to be between two points, using a concept learned in a different 2D domain.

This work uses energy functions to let our agents learn to classify and generate simple concepts, which they can use to solve tasks like navigating between two points in dissimilar environments. Examples include visual concepts (“red” or “square”), spatial concepts (“inside”, “on top of”), temporal concepts (“slow”, “after”), and social concepts (“aggressive”, “helpful”), among others. These concepts, once learned, act as basic building blocks of an agent’s understanding and reasoning, as shown in other research from DeepMind and Vicarious.

Energy functions let us build systems that can generate (left) and also identify (right) basic concepts, like the notion of a square.

Energy functions work by encoding a preference over states of the world, which allows an agent with different available actions (changing torque vs. directly changing position) to learn a policy that works in different contexts; this roughly translates to the development of a conceptual understanding of simple things.

To create the energy function, we mathematically represent concepts as energy models. The idea of energy models is rooted in physics, with the intuition that observed events and states represent low-energy configurations.
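
To make that intuition concrete (this is the standard energy-based-model formulation in general, not a detail specific to this work), an energy function E over states x induces a Boltzmann distribution under which low-energy configurations are the most probable; learning then amounts to shaping E so that observed configurations sit in its low-energy regions:

```latex
p(x) \;=\; \frac{\exp\!\big(-E(x)\big)}{Z},
\qquad
Z \;=\; \int \exp\!\big(-E(x)\big)\, dx
```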

We define an energy function E(x, a, w) for each concept in terms of:

  • The state of the world the model observes (x)
  • An attention mask (a) over entities in that state
  • A continuous-valued vector (w), used as conditioning, that specifies the concept for which energy is being calculated

States of the world are composed of sets of entities and their properties and positions (like the dots below, which have both positional and colored properties). Attention masks, used for “identification”, represent a model’s focus on some set of entities. The energy model outputs a single positive number indicating whether the concept is satisfied (when energy is zero) or not (when energy is high). A concept is satisfied when an attention mask is focused on a set of entities that represent a concept, which requires both that the entities are in the correct positions (modification of x, or generation) and that the right entities are being focused on (modification of a, or identification).
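
As a purely illustrative example of these semantics (our own toy function, not the learned model from this work, and “clustered together” is not one of the paper’s concepts), a hand-written energy that is near zero when the attended entities lie close together and large otherwise might look like this:

```python
import torch

def toy_energy(x, a):
    """Illustrative hand-written energy (not the learned model).

    x: (N, 2) tensor of entity positions.
    a: (N,) attention weights in [0, 1] summing to 1.
    Returns a scalar >= 0 that is near zero when the attended
    entities are tightly clustered (a toy "close together" concept).
    """
    center = (a.unsqueeze(1) * x).sum(dim=0)              # attention-weighted centroid
    spread = (a * ((x - center) ** 2).sum(dim=1)).sum()   # weighted squared distance to it
    return spread

x = torch.tensor([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0]])
a_good = torch.tensor([0.5, 0.5, 0.0])   # attend to the two nearby points
print(toy_energy(x, a_good))             # near zero: concept satisfied
a_bad = torch.tensor([0.5, 0.0, 0.5])    # attend to two far-apart points
print(toy_energy(x, a_bad))              # large: concept violated
```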

We construct the energy function as a neural network based on the relational network architecture, which allows it to take an arbitrary number of entities as input. The parameters of this energy function are what our training procedure optimizes; other functions are derived implicitly from the energy function.
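
A minimal sketch of what such a relational energy network could look like in PyTorch is shown below. This is our own simplified reconstruction: the layer sizes, the way the attention mask a and concept code w are folded into pair features, and the softplus output are assumptions, not the paper’s exact architecture.

```python
import torch
import torch.nn as nn

class RelationalEnergy(nn.Module):
    """Simplified relational-network energy E(x, a, w).

    Scores every ordered pair of entities, conditioned on the attention
    weights and the concept code w, then pools the pairwise scores into
    a single non-negative energy.
    """
    def __init__(self, state_dim=2, w_dim=8, hidden=64):
        super().__init__()
        pair_in = 2 * (state_dim + 1) + w_dim   # two entities (+attention) plus concept code
        self.pair_mlp = nn.Sequential(
            nn.Linear(pair_in, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.out = nn.Linear(hidden, 1)

    def forward(self, x, a, w):
        # x: (N, state_dim) entity states, a: (N,) attention, w: (w_dim,) concept code
        n = x.shape[0]
        ent = torch.cat([x, a.unsqueeze(1)], dim=1)          # (N, state_dim + 1)
        pairs = torch.cat([
            ent.unsqueeze(1).expand(n, n, -1),               # entity i
            ent.unsqueeze(0).expand(n, n, -1),               # entity j
            w.expand(n, n, -1),                              # concept code, broadcast to pairs
        ], dim=-1)
        h = self.pair_mlp(pairs).sum(dim=(0, 1))             # permutation-invariant pooling
        return torch.nn.functional.softplus(self.out(h)).squeeze()  # non-negative scalar energy

energy = RelationalEnergy()
x = torch.rand(5, 2)                      # five entities in 2D
a = torch.softmax(torch.rand(5), dim=0)   # attention over the entities
w = torch.randn(8)                        # concept code
print(energy(x, a, w))                    # a single non-negative energy value
```

Because the pairwise scores are summed over all ordered pairs, the output is invariant to entity ordering and accepts any number of entities.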

This approach lets us use energy functions to learn a single network that can perform both generation and recognition. This allows us to cross-employ concepts learned from generation to identification, and vice versa. (Note: This effect is already observed in animals via mirror neurons.)
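
As a sketch of how a single trained energy function could serve both modes (our own helper functions; step counts and learning rates are arbitrary), generation descends the energy with respect to the entity positions, while identification descends it with respect to attention logits over a fixed state:

```python
import torch

def generate(energy_fn, x, a, w, steps=50, lr=0.1):
    """Generation: move entity positions downhill on the energy,
    holding the attention mask fixed."""
    x = x.detach().clone().requires_grad_(True)
    for _ in range(steps):
        e = energy_fn(x, a, w).sum()
        (g,) = torch.autograd.grad(e, x)
        x = (x - lr * g).detach().requires_grad_(True)
    return x.detach()

def identify(energy_fn, x, w, steps=50, lr=0.1):
    """Identification: hold the state fixed and optimize attention
    logits so that softmax(logits) focuses on the concept's entities."""
    logits = torch.zeros(x.shape[0], requires_grad=True)
    for _ in range(steps):
        a = torch.softmax(logits, dim=0)
        e = energy_fn(x, a, w).sum()
        (g,) = torch.autograd.grad(e, logits)
        logits = (logits - lr * g).detach().requires_grad_(True)
    return torch.softmax(logits, dim=0).detach()
```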

Our training data is composed of trajectories of (attention mask, state) pairs, which we generate ahead of time for the specific concepts we’d like our model to learn. We train our model by giving it a set of demonstrations (typically 5) for a given concept set, then giving it a new environment (X0) and asking it to predict the next state (X1) and next attention mask (a). We optimize the energy function such that the next state and next attention mask found in the training data are assigned low energy values. Similar to generative models like variational autoencoders, the model is incentivized to learn values that usefully compress aspects of the task. We trained our model using a variety of concepts involving visual, spatial, proximal, and temporal relations, as well as quantification, in a two-dimensional particle environment.
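
The sketch below shows the general shape of one such update under heavy simplification. It is our own reconstruction, not the paper’s exact objective: concept_encoder is a hypothetical module that maps the demonstrations to the concept code w, and the contrastive term that pushes up the energy of model-generated samples is a standard energy-based-model trick we add to keep the energy from collapsing to zero everywhere; the full procedure involves additional components we omit here.

```python
import torch

def training_step(energy_fn, concept_encoder, demos, x0, x1_true, a_true, opt,
                  neg_steps=10, step_size=0.1):
    """One heavily simplified training update (a sketch, not the exact loss).

    demos:   demonstrations of one concept (typically 5 (state, attention) pairs).
    x0:      a new environment the model starts from.
    x1_true: ground-truth next state; a_true: ground-truth attention mask.
    """
    w = concept_encoder(demos)            # hypothetical encoder: demos -> concept code w

    # Generate a "negative" next state by descending the current energy from x0,
    # mirroring how the model generates states at inference time.
    x_neg = x0.detach().clone().requires_grad_(True)
    for _ in range(neg_steps):
        e = energy_fn(x_neg, a_true, w.detach()).sum()
        (g,) = torch.autograd.grad(e, x_neg)
        x_neg = (x_neg - step_size * g).detach().requires_grad_(True)

    # Assign low energy to the demonstrated data and higher energy to the
    # model's own samples (the contrastive term added to avoid a flat energy).
    loss = energy_fn(x1_true, a_true, w) - energy_fn(x_neg.detach(), a_true, w)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```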

Spatial Region Concepts: given demonstration 2D points (left), energy function over point placement is inferred (middle), stochastic gradient descent over energy is then used to generate new points (right)

We evaluated our approach across a suite of tasks designed to see how well our single system could learn to identify and generate things united by the same concept; our system can learn to classify and generate specific sets of spatial relationships, or can navigate entities through a scene in a specific way, or can develop good judgements for concepts like quantity (one, two, three, or more than three) or proximity.

Quantity Concept: demonstration attention is placed on one, two, three, or more than three entities. Inference is used to generate attention masks of similar quantity

Models perform better when they can share experience between learning to generate concepts (by moving entities within the state vector x) and learning to identify them (by changing the attention mask over a fixed state vector): when we evaluated models trained on both of these operations, they performed better on each operation than models trained on that operation alone. We also found indications of transfer learning: an energy function trained only in a recognition context performs well on generation, even without being explicitly trained to do so.

Proximity Concepts: demonstration events bring attention to the entity closest to or furthest from the marker, or bring the marker closest to or furthest from an entity of a particular color (left). Inference is used to generate attention masks for the closest or furthest entity (recognition) or to place the marker closest to or furthest from an entity (generation) (right)

In the future we’re excited to explore a wider variety of concepts learned in richer, three-dimensional environments, integrate concepts with the decision-making policies of our agents (we have so far only looked at concepts as things learned from passive experience), and explore connections between concepts and language understanding. If you are interested in this line of research, consider working at OpenAI!


Acknowledgements

Thanks to those who contributed to this paper and blog post:

Blog post: Prafulla Dhariwal, Alex Nichol, Alec Radford, Yura Burda, Jack Clark, Greg Brockman, Ilya Sutskever, Ashley Pilipiszyn