Kids today are voracious technology consumers for sure, but they are also active participants—creators, collaborators, and even influencers. So parents have much more to wrestle with than some broad-brush notion of “screen time.” Whatever you think of Fortnite, when children play it they are working together as a team (that’s good!), in a tech-mediated environment (is that bad?); and they’re not just playing a videogame, they’re socializing—factor that into your screen-time calculus.
The youngs are also creating their own content, from unboxing toy reviews and “watch me play Minecraft” videos to DIY slime posts on Instagram, some of them attracting massive audiences and making serious money. They have agency and aspirations online, which makes them vulnerable to the same forces and incentives that continue to transform society. Technology is their native environment as much as the physical world is, and it’s up to parents—most of whom have their own tech-related issues to deal with—to help them thrive in this often mystifying hybrid existence. It’s nothing to freak out about. We’ll all get through this together.
A decade ago, Amazon abruptly deleted copies of George Orwell’s 1984 from the Kindles of its American customers. The move instantly evoked the “memory holes” in the novel’s totalitarian dystopia, and it inspired about equal measures of shock, outrage, and jokes. (If a fictional Amazon in a dystopian novel had performed the same mass deletion, critics would have said it was too on the nose.) But in hindsight, Amazon’s action was also a striking harbinger of a shift that has only become more pronounced since then: our wholesale tilt toward becoming a tenant society.
In that particular case, Amazon said the books had been added to the Kindle Store by a vendor who didn’t actually have the rights to them. “When we were notified of this by the rights holder, we removed the illegal copies from our systems and from customers’ devices, and refunded customers,” said a spokesperson at the time. Amazon quickly apologized and said that in the future it would leave books on people’s devices even if there was an error in how they got there. But one thing the company couldn’t take back was the demonstration of its sheer power. Even the biggest traditional retailer could hardly dream of reaching into people’s houses and taking back what it had sold them.
Today, we may think we own things because we paid for them and brought them home, but as long as they run software or have digital connectivity, the sellers continue to have control over the product. We are renters of our own objects, there by the grace of the true owner.
Of course, “smart,” connected machines do come with plenty of upsides. A modern washing machine doesn’t just agitate the clothes around for a fixed amount of time; it senses water levels and dampness and can adjust how long it spins so your clothes come out at just the right level of dryness. Cars are more fuel-efficient because their computers optimize many aspects of their operation, from fuel injection to braking. All of this is good for the environment and your wallet.
But that is not all that’s happening. Connectivity and embedded intelligence are being used by large corporations to increase their profits and to exercise as much control as they can get away with. Perhaps the most egregious example involves John Deere tractors—those iconic, bright green giants that rumble across big fields, noisily harvesting wheat, corn, and soy. For generations, farmers have repaired their tractors right on the farmstead. But in its push toward building ever more automated, sensor-packed agricultural equipment, John Deere has put draconian software locks on its tractors, forcing customers to visit the company’s own repair shops. Farmers complain they are charged exorbitant sums for even simple repairs. And they lose crucial time heading out to the shop during the harvest season. Desperate farmers have taken to hanging out in shady internet forums, looking for software that will get around John Deere’s locks, trying to assert their right to repair the tractors they ostensibly own.
Apple, too, has waged a scorched-earth campaign against anyone with the audacity to repair its products or replace its batteries. In 2017 it came to light that the company was secretly throttling iPhones with older batteries, slowing down their performance. Apple said it was doing so only to keep the aging phones from crashing outright. This technical reason for the move made sense; the fact that it was secret did not. And it was hard to ignore the ways the policy might also benefit the company. Users with increasingly slow phones were, in effect, being nudged to purchase a new device, allowing Apple to increase its already hefty profit margins.
After the scandal, an embarrassed Apple offered cheap replacement batteries. It soon became clear why the company held the line against them for so long. New iPhone sales went down; Tim Cook told shareholders that increased repairs were “a factor” in this trend.
More recently, Apple has reportedly cut a deal with Amazon to remove “unauthorized” refurbishers of Apple products—people who resell repaired machines—from the Amazon marketplace. In return, it will let Amazon sell new Apple products: a win-win for the two giants, but not for consumers. Apple also forces recyclers to shred old iPhones and MacBooks rather than reuse their parts and materials. That’s definitely bad not just for consumers but also for the environment.
But this isn’t merely a fight over prices and profit margins. What happens when you do something with your car, phone, or other object that corporate headquarters really doesn’t like? Our connected devices can simply be bricked on command. Cars have been immobilized, for example, when the ostensible owner fell behind on payments by as little as three days. John Deere tractors with “unauthorized repairs” have been similarly taken out of commission. How long before other devices start behaving as spies and taskmasters in our own home? Will the coffee maker let us have that seventh cup that the doctor advised us against?
It’s true that repairs of complicated gadgets may sometimes need to be done by licensed parties. But rather than more secrecy and exclusive control, companies could expand the base of people capable of doing the work. It’s also true that connectivity is necessary for devices that run software; bugs need to be fixed and software updated. But there is no reason for that to be an unbounded license to brick a device or erase its content.
In March, US senator and Democratic presidential candidate Elizabeth Warren called for Congress to pass a national right-to-repair law that “empowers farmers to repair their equipment without going to an authorized agent.” But it’s not just farmers. It’s all of us. We have fewer rights as digital tenants than we do as tenants of real estate, where eviction is subject to due process. If we purchase something, it is ours. We shouldn’t let ownership go down the memory hole.
Zeynep Tufekci (@zeynep) is a WIRED contributor and a professor at the University of North Carolina at Chapel Hill.
Trees, from the mighty redwoods to slender dogwoods, would be nothing without their microbial sidekicks. Millions of species of fungi and bacteria swap nutrients between soil and the roots of trees, forming a vast, interconnected web of organisms throughout the woods. Now, for the first time, scientists have mapped this “wood wide web” on a global scale, using a database of more than 28,000 tree species living in more than 70 countries.
“I haven’t seen anybody do anything like that before,” says Kathleen Treseder, an ecologist at the University of California, Irvine. “I wish I had thought of it.”
Before scientists could map the forest’s underground ecosystem, they needed to know something more basic: where trees live. Ecologist Thomas Crowther, now at ETH Zurich in Switzerland, gathered vast amounts of data on this starting in 2012, from government agencies and individual scientists who had identified trees and measured their sizes around the world. In 2015, he mapped trees’ global distribution and reported that Earth has about 3 trillion trees.
Inspired by that paper, Kabir Peay, a biologist at Stanford University in Palo Alto, California, emailed Crowther and suggested doing the same for the web of underground organisms that connects forest trees. Each tree in Crowther’s database is closely associated with certain types of microbes. For example, oak and pine tree roots are surrounded by ectomycorrhizal (EM) fungi that can build vast underground networks in their search for nutrients. Maple and cedar trees, by contrast, prefer arbuscular mycorrhizae (AM), which burrow directly into trees’ root cells but form smaller soil webs. Still other trees, mainly in the legume family (related to crop plants such as soybeans and peanuts), associate with bacteria that turn nitrogen from the atmosphere into usable plant food, a process known as “fixing” nitrogen.
The researchers wrote a computer algorithm to search for correlations between the EM-, AM-, and nitrogen-fixer–associated trees in Crowther’s database and local environmental factors such as temperature, precipitation, soil chemistry, and topography. They then used the correlations found by the algorithm to fill in the global map and predict what kinds of fungi would live in places where they didn’t have data, which included much of Africa and Asia.
Local climate sets the stage for the wood wide web, the team reports today in Nature. In cool temperate and boreal forests, where wood and organic matter decay slowly, network-building EM fungi rule. About four in five trees in these regions associate with these fungi, the authors found, suggesting the webs found in local studies indeed permeate the soils of North America, Europe, and Asia.
By contrast, in the warmer tropics where wood and organic matter decay quickly, AM fungi dominate. These fungi form smaller webs and do less intertree swapping, meaning the tropical wood wide web is likely more localized. About 90% of all tree species associate with AM fungi; the vast majority are clustered in the hyperdiverse tropics. Nitrogen fixers were most abundant in hot, dry places such as the desert of the U.S. Southwest.
Charlie Koven, an Earth system scientist at the Lawrence Berkeley National Laboratory in California, applauds what he says is the first global forest microbe map. But he wonders whether the authors missed some important factors that also shape the underground world. Hard-to-measure processes such as nutrient and gas loss from the soil could affect where different microbes live; if so, the study’s predictions could be less accurate, he says.
Despite such uncertainties, having the first hard numbers for which tree-associated microbes live where will be “very useful,” Treseder says. The findings could, for example, help researchers build better computer models to predict how much carbon forests will squirrel away and how much they will spew into the atmosphere as the climate warms, she says.
Crowther, however, is ready to make a prediction now. His results suggest that as the planet warms, about 10% of EM-associated trees could be replaced by AM-associated trees. Microbes in forests dominated by AM fungi churn through carbon-containing organic matter faster, so they could liberate lots of heat-trapping carbon dioxide quickly, potentially accelerating a climate change process that is already happening at a frightening pace.
That argument is “a little bit more tenuous” to Treseder. She says scientists are still puzzling out how different soil fungi interact with carbon. But, she adds, “I’m willing to be convinced.”
I’ve gone through many iterations of a home NAS over the years. It started with an AMD Phenom-based system running in a Norco case with about 8 SATA hard drives (4 primary, 4 backup via rsync). The noise and power consumption of that system were overkill for what it was used for, so I sold it and went to a four-bay QNAP NAS a few years ago. With some hard drive capacity upgrades at the time too, I was able to reduce that to three primary drives (all 4TB) that backed up to two external USB 3.0 drives overnight.
The QNAP was great, but I always had a few lingering concerns with that setup:
When I hit four drives, I’d be done. I’d need to rip it out and upgrade, and the cost of a purpose-made 6+ bay NAS was astronomical for my home budget.
The file system of all the internal drives, even when I ran them as single disks, was set up as RAID (yes, it somehow made each single disk its own single-disk RAID array, rather than just using EXT4 or a similar single-disk file system). This made it hard to pull them out of the NAS and put them into any other system without hoping I could rebuild the single-disk array and not lose my files. For where I sit with Linux comfort, I didn’t want this.
It was nearing end-of-life for updates, and recent updates had really started to slow down its aging ARM processor.
Time for something new.
I had the following requirements:
Shoestring budget, under $300 preferred
Scalable, not locked into a certain number of drive bays
USB 3.0 (for fast backups to my backup drives)
Preferred: compatibility with open source NAS offerings (OpenMediaVault, FreeNAS, etc.)
Low power usage (I don’t have high computing requirements for this system, so it doesn’t need to be driving up my electricity bill to run every month)
Turns out I had pretty slim pickings in this area. I knew pretty early on that I’d take a single-board computer (SBC) approach, and this narrowed me down to the following contenders:
Rock64
Had USB 3.0, gigabit ethernet, a well supported custom build of OpenMediaVault, and was only $45 CAD! At that price, I could even order a second and instead of running my backups across the USB 3.0 bus from one drive to another, I could backup across gigabit LAN from Primary NAS to Backup NAS (maybe in the future!).
ODROID XU4
Had USB 3.0, had gigabit ethernet, had OpenMediaVault support, had a good community and strong Linux support, but it was more than twice the price of the Rock64 in Canada, and the Rock64 offered me all the same capabilities.
Helios 4
This board had four native SATA ports and USB 3.0, all at a fairly attractive price. The turn-off? It was a Kickstarter project, and even though they’re operating on their own now, it’s still “pre-order” only, with no idea when the next batch will arrive. Those who did have them were pretty open about the fact that some kinks are still being ironed out. It looks like the board is being made in partnership with SolidRun, and it really has some potential. I’ll be watching this one as a potential upgrade for sure.
Others
I built at an interesting time – both Pine64 (the makers of Rock64) and ODROID are coming out with really attractive looking boards based on the new RK3399 chipset. The RockPro64 will have USB 3.0 (both Type A and Type C), as well as PCI-E x1 where I could easily add a native SATA card. ODROID is coming out with the N1, which will have 2x native SATAIII headers, and USB 3.0. Both are attractive looking NAS options!
I figure it could take a good 6-12 months until these hit the market and have strong OpenMediaVault support built up, so I went with the Rock64. When these boards come out, I may very easily buy one of them to replace the Rock64 and then the Rock64 can run four drives as my backup NAS so I’m not running my rsync backups over the same USB bus that all drives share.
I needed some supplementary parts to complete this build, so here was the total parts list:
Rock64 – $46.50 CAD
Power Supply – $15.95 CAD
32GB MicroSDXC – $17.98 CAD (you could go smaller than 32gb and do this for about $10)
4x Orico 3569S3 UASP USB 3.0 Drive enclosure – $22.99 CAD x4 (bought one spare for planned drive capacity increase)
TOTAL: $172.39 CAD before tax
I already had all my drives, a mix of HGST and Western Digital NAS drives that were coming from the QNAP into the new Orico enclosures, and I already had my backup drives in pre-existing dual-drive USB 3.0 enclosures too. The drives needed a USB 3.0 hub, which I also had laying around, but a quality one can be had for $15-20 on Amazon.
I decided to grab the latest OMV 4.0 release for the Rock64 from Ayufan’s Github (he’s the go-to guy for building and maintaining the OMV Rock64 builds). At first I wasn’t sure if I wanted his armhf or arm64 build, but the OMV forum recommended armhf for lower memory usage.
I flashed my SD card, booted the Rock64, and was at a login prompt all in under 10 minutes.
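For anyone curious what that flashing step looks like, here’s a minimal sketch. To keep it safe to run anywhere, temp files stand in for the downloaded image and the SD card’s device node; the real image filename and device node shown in the comments are hypothetical, so double-check yours with `lsblk` before pointing `dd` at real hardware.

```shell
# Stand-ins so this sketch can't clobber anything real:
# IMG mimics the downloaded OMV image, OUT mimics the SD card.
IMG=$(mktemp)
OUT=$(mktemp)
head -c 1048576 /dev/zero > "$IMG"   # fake 1 MiB "image" for the demo

# The real invocation would look like:
#   dd if=omv_4.x_rock64.img of=/dev/sdX
# (hypothetical names; dd overwrites whatever device you point it at,
# so verify the target with lsblk first)
dd if="$IMG" of="$OUT" bs=4096 2>/dev/null

# Verify the write byte-for-byte before pulling the card.
cmp -s "$IMG" "$OUT" && echo "flash verified"
```

After writing the image, `cmp` against the source file is a cheap way to catch a bad card or an interrupted write before you boot from it.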
All my primary drives were re-formatted to EXT4, my backups restored, and then it was simply a matter of customizing the NAS as needed. I set up some AFP and SMB shares for my Mac and Windows machines, some NFS shares for my Raspberry Pi and Kodi boxes, and made some other misc. setting changes as I saw fit.
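The nightly drive-to-drive backup carries over from the QNAP days unchanged: it’s just rsync mirroring each primary drive to its backup. A minimal sketch, using temp directories as stand-ins for the real mount points (the `/srv/dev-disk-by-label-*` layout mentioned in the comment is an assumption about OMV defaults; adjust to your own paths):

```shell
# Temp dirs stand in for the real mount points (assumption: OMV mounts
# filesystems under /srv/dev-disk-by-label-*; adjust to your layout).
SRC=$(mktemp -d)
DST=$(mktemp -d)
echo "family-photos" > "$SRC/photos.txt"

# -a (archive) preserves permissions, ownership, and timestamps;
# --delete removes files from the backup that no longer exist on the
# primary, keeping the backup an exact mirror. The trailing slash on
# SRC copies its contents rather than the directory itself.
rsync -a --delete "$SRC/" "$DST/"
cat "$DST/photos.txt"
```

In the real setup this runs overnight, once per primary/backup pair, either from a cron entry or from OMV’s built-in scheduled rsync jobs.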
This is the kicker! I was expecting good performance from USB 3.0 given my use cases. I realize all drives are sharing a USB 3.0 bus, but I’m never in a situation where I’m hammering all the drives at once and would saturate that bus heavily anyway. The NAS is simply home storage for all our computers/devices to share, not much more.
After all the recent updates, my old QNAP NAS would average around 17-22 MB/s transfer speeds. The Rock64 build? I was averaging 60-113 MB/s depending on the size of files being moved. This was a successful build in my mind! After selling my QNAP NAS, I came out ahead on my expenses for the build.
Stability has been great so far, especially considering that I’m running a highly experimental OMV 4.0 build rather than the stable OMV 3.0 release. I had one drive that kept disappearing, but that traced back to a faulty drive enclosure; since I moved the drive to a new enclosure, it’s remained mounted ever since. I also get some random errors when I try to set a static IP, which I haven’t worked around yet. I assume they’re likely due to the “bleeding edge” nature of an early alpha/beta OMV build, so I’ve statically assigned the board’s address in my router instead and will keep an eye on future updates before engaging the developers. Any other errors were usually human error on my part: trying to make changes to a drive that was still active (under I/O, currently mounted, etc.).
I can’t wait to keep playing with this setup, and to see what the new SBC boards coming to market will offer as upgrade paths. The best part of this setup is that it’s modular and very low cost: I can easily swap out just the Rock64 board as newer, faster SBCs become available and never break the bank (probably never exceeding $100 or so either). That’s not nearly as bad as swapping out a $500-700 purpose-built NAS!
Questions on this build? Leave them in the comments below and I’ll get back to you!
One of the most important ways that Microsoft wants to make the new Chromium-based Edge different from the current EdgeHTML-based Edge is in its support for other platforms. The original Edge was, for no good reason, tied to Windows 10, meaning that Web developers on platforms such as Windows 7 or macOS had no way of testing how their pages looked, short of firing up a Windows 10 virtual machine.
The new browser is, in contrast, a cross-platform affair. The first preview builds were published for Windows 10, with versions for Windows 7, 8, and 8.1 promised soon; today, these are joined by builds for macOS.
The macOS version resembles the Windows 10 builds that we’ve seen so far, but it isn’t identical. Microsoft wants to be a good citizen on macOS by producing not just an application that fits the platform’s standards—using the right fonts, icons, spacing, and so on—but which also adapts to Apple’s unique hardware. To that end, the company is working on support for the Touch Bar found on recent MacBook Pro models.
Nostalgic World of Warcraft (WoW) fans have been calling for game publisher Activision-Blizzard to release World of Warcraft Classic for years, and they’re finally getting their wish. World of Warcraft Classic is now in beta, but some players have been surprised by what they’ve found when playing it.
WoW Classic seeks to recreate the “vanilla WoW” experience—that is, WoW as it existed before a series of seven game-altering major expansion packs from 2007’s The Burning Crusade to 2018’s Battle for Azeroth. To achieve this, Blizzard has rebuilt the game based on archived data from back in 2005 and 2006 (patch 1.12 is the goalpost—the current game is on patch 8.1.5). The company has committed to meticulously presenting the experience exactly as it was back then—warts and all—with only a small number of unavoidable or critical changes.
The argument for this is simple: what makes classic WoW great to one player might be different from what makes it great for another. And who are Blizzard’s designers to say which old features were just good or bad design for each player? It’s an approach that shows Blizzard believes (at least to some degree) that WoW doesn’t just belong to its creators but to its fans. That struggle between authorial intent or game design orthodoxy and “the player is always right” is at the heart of many of gaming’s big contemporary controversies. But so far, Blizzard seems committed to its plan with regards to WoW Classic.
So committed, in fact, that modern WoW players are trying the beta and reporting what seem like bugs today but were actually intended functionality 13 years ago. This became such a common occurrence that Blizzard publicly posted a list of known non-issues called the “WoW Classic ‘Not a Bug’ List.”
Global health efforts to design and deliver improved cookstoves don’t always catch on. Experience has shown poor households in rural settings will rarely pay for or use these new stoves, which are intended to lower firewood demands and improve indoor and outdoor air quality.
However, adopting some common business practices, such as upgrading the supply chain, performing careful market analysis and offering price rebates, can increase purchase and adoption of improved cookstoves by as much as 50 percent in rural India, according to a new study led by Duke University researchers.
Three billion people still rely on traditional cookstoves that use solid fuels such as wood or coal. These stoves contribute to climate change through carbon emissions, deforestation and toxic air pollution, which contributes to poor health among users and their communities.
Improved cookstoves use either electricity or biomass as an energy source. Switching to them can deliver ‘triple wins’: better household health, better environmental health and reduced climate change emissions.
The adoption of improved cookstoves has been slow, however, likely because of constraints imposed by differences in markets, culture and geography.
“Previous studies have found low demand for these cookstoves, however, our study found that when barriers to adopting the stoves were addressed, the demand was high,” said Subhrendu Pattanayak, Oak Professor of Environmental and Energy Policy at Duke’s Sanford School of Public Policy and lead author of the study.
“A big question for policy scientists has been: Can we figure out what technology and energy and environmental services people want, and use that understanding to get people to pay for them? Our study shows that we can,” he said.
The Duke researchers took a novel approach by implementing the study in three phases—diagnose, design and test—over a period of five years.
In the first phase, the researchers analyzed existing research on improved cookstove adoption, and looked at sales across different potential study communities, which provided insight into both demand- and supply-side barriers to adoption. They found no common strategies for promoting changes in cooking behavior, but instead concluded the socio-economic case for adoption was influenced by local context. They then conducted focus groups in more than 100 households in 11 rural Indian communities, which allowed researchers to understand local cooking practices, perceptions of different stoves and preferences for stove features.
In the design phase, researchers worked with local organizations to implement eight small pilot programs in three different settings. This included small-scale testing of various supply chain issues such as marketing and home delivery, rebates and financing, and offers of electric and/or biomass cookstoves.
In the third phase, they conducted a field experiment to determine whether the combination of upgraded supply and demand promotion would lead to increased adoption of improved cookstoves. The field test included nearly 1,000 households in 97 geographically distinct villages in the Indian Himalayas.
The experiment showed that more than half of the intervention households bought an improved cookstove compared with zero purchases in the control villages. The demand was very price-sensitive, and the largest rebate, 33 percent of retail price, led to the largest purchase rate, 74 percent. In the areas that only had an upgraded supply chain and promotion without rebates, there was a 28 percent increase in ownership of the improved cookstoves.
Households overwhelmingly preferred the electric stove over the biomass stove, by a factor of two to one. Respondents liked the lack of smoke, speed of cooking and portability and attractiveness of the stove.
However, this preference for electric stoves highlighted the lack of a steady source of electricity. In India, rural electrification rates have been rising rapidly, growing from 57 to 83 percent between 2005 and 2015.
“Our work shows how energy access programs and projects can scale up and achieve success by understanding local demand and developing robust regional supply chains,” said co-author Marc Jeuland, professor of public policy and global health at Sanford.
The researchers also looked at whether stove ownership persisted over time, going back to the households three and 15 months after purchase. Most households still owned the stoves, although at the later date, 15 percent of households reported their stoves needed repair. The authors argue that barriers to improved cookstove adoption can be overcome and that households are willing to pay substantial prices for them, but that maintenance and sustainability require additional attention.
The interventions that helped improve adoption rates are similar to common marketing and sales practices of private firms, Pattanayak said. While the interventions appear costly on paper, the costs are less than the social and environmental benefits gained.
“Our findings suggest that market analysis, robust supply chains and price discounts are critical for improved cookstove adoption,” said Pattanayak.
The study appears the week of May 20 in the Proceedings of the National Academy of Sciences of the United States of America.
Giving rural Indians what they want increases demand for cookstoves (2019, May 20)
retrieved 21 May 2019
Reviewing the KINGPIN 2080 Ti is a tough job, but we’ve looked at it for “regular” use, water-cooled overclocking, liquid nitrogen overclocking, and more.
The EVGA RTX 2080 Ti KINGPIN is among the highest-end cards built around the big Turing GPU die, and at $1900, it’s also among the most expensive. This review of the EVGA Kingpin 2080 Ti looks at overclocking with the stock (liquid) cooler, extreme overclocking with liquid nitrogen, power consumption, thermal performance, and overall value and quality of the card. If you’re looking for the “best” RTX 2080 Ti, well, clearly that search ends at something like the Lightning, the HOF, or the KINGPIN, but these cards aren’t meant for just any user.
A top Fortnite player is suing his gaming organization for allegedly restricting his business opportunities with an “oppressive, onerous and one-sided” contract.
Turner Tenney, a 21-year-old professional gamer known as Tfue, has filed a lawsuit against esports team Faze Clan for allegedly limiting his ability to pursue his profession in violation of California law and the Talent Agency Act. The lawsuit also alleges that Faze Clan has failed to pay him his share of his sponsorship earnings and taken up to 80% of his earnings as a “finder’s fee.”
“In no uncertain terms, these gamers are artists, entertainers and content creators — they perform, they act, they direct, they edit and they stream,” Tenney’s lawyers wrote in the complaint, filed Monday in Los Angeles Superior Court.
“Because the industry is so new, there is little to no oversight,” according to the complaint, a copy of which was obtained by The Hollywood Reporter. “As a result, these young content creator/streamers are susceptible to being taken advantage of and exploited — often by those that are supposed to be looking out for their best interests.”
Los Angeles-based Faze Clan didn’t immediately respond to a request for comment.
“Faze Clan uses its illegal Gamer Contracts to limit Tenney to deals sourced exclusively by Faze Clan and to prevent Tenney from exploring deals presented by others; deals that are potentially superior to deals procured by Faze Clan; and deals that are not saddled with an eighty percent (80%) finder’s fee,” the complaint says.
Tenney tried to terminate his contract with Faze Clan in September based on the organization’s numerous breaches of contract, but Faze Clan disputes that the agreement was terminated and contends that Tenney still owes it contractual obligations, the complaint says.