Even large industrial laboratories, with the resources to build the biggest and most complex systems, have sounded the alarm. When Facebook tried to replicate AlphaGo, the system developed by Alphabet's DeepMind to master the ancient game of Go, the researchers were worn down by the task. The enormous computing requirements – millions of experiments running on thousands of devices over days – combined with unavailable code made the system "very difficult, if not impossible, to reproduce, study, improve upon, and extend," they wrote in a paper published in May. (The Facebook team eventually succeeded.)
The AI2 research suggests a remedy. The idea is to report more data about the experiments that were run. You can still report the best model you obtained after, say, 100 experiments – the result that could be called "state of the art" – but you would also report the range of performance you could expect if you only had the budget to try 10 times, or just once.
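That reporting idea can be sketched numerically: given the validation scores from a full hyperparameter sweep, you can estimate the best score a smaller budget would likely have bought. Below is a minimal Monte Carlo sketch; the function name and the example sweep are illustrative assumptions, not AI2's actual code.

```python
import random

def expected_best(scores, budget, trials=10_000):
    """Monte Carlo estimate of the best validation score you would
    expect after `budget` randomly chosen experiments, given the
    scores observed across a full sweep."""
    total = 0.0
    for _ in range(trials):
        # Draw a random subset of `budget` experiments and keep the best
        total += max(random.sample(scores, budget))
    return total / trials

# Scores from a hypothetical sweep of 100 hyperparameter trials
sweep = [0.5 + 0.004 * i for i in range(100)]
print(round(expected_best(sweep, 1), 2))    # roughly the average trial
print(round(expected_best(sweep, 10), 2))   # closer to the best
print(round(expected_best(sweep, 100), 2))  # the "state of the art" number
```

Reporting the whole curve, rather than the single best number, tells a smaller lab what a 10-experiment budget would realistically achieve.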
The point of replication, according to Dodge, is not to exactly duplicate the results. That would be nearly impossible, given the inherent randomness of neural networks and variations in hardware and code. Rather, the idea is to offer a road map for reaching the same conclusions as the original research, especially when those conclusions involve deciding which machine learning system is best suited to a given task.
This could help make research more efficient, Dodge says. When his team rebuilt some popular machine learning systems, they found that for some budgets, older methods outperformed flashier ones. The idea is to help smaller university labs by showing them how to get the most value for their money. A side benefit, he adds, is that the approach could encourage greener research, since training large models can emit as much carbon as the lifetime emissions of a car.
Pineau is encouraged to see others trying to "open up the models," but she is unsure whether most labs will take advantage of the cost savings. Many researchers will still feel pressure to use more computing power to stay on the cutting edge, and worry about efficiency later. It is also hard to generalize about how researchers should report their results, she adds. It is possible that AI2's "show your work" approach masks complexities in how researchers select their best models.
These variations in methods are part of why the NeurIPS reproducibility checklist is voluntary. Proprietary code and data are an obstacle, especially for industrial labs. If, for example, Facebook is doing research with your Instagram photos, publicly sharing that data is a problem. Clinical research involving health data is another sticking point. "We don't want to cut researchers off from the community," she says.
In other words, it is difficult to develop reproducibility standards that work without hindering researchers, especially as methods evolve rapidly. But Pineau is optimistic. Another element of the NeurIPS reproducibility effort is a challenge asking other researchers to reproduce accepted papers. Compared with other fields, such as the life sciences, where old methods die hard, this one is more open to putting researchers in that position. "It's young in terms of both people and technology," she says. "There is less inertia to fight."
In the process, however, Uber may have helped energize critics who insist Big Tech companies be held accountable for their toxic negligence.
Looking back: The California Legislature this week passed, and Governor Gavin Newsom is expected to sign, a landmark bill designed to protect, among others, Uber and Lyft drivers – workers who labor every day for corporate employers but are nonetheless treated as independent contractors. You know, as if the drivers were just free agents picking up customers as part of some commercial passenger-transport venture they had dreamed up and started running themselves.
The measure, called AB 5, would require a company to treat its workers as employees if it exercises control over how they perform their duties or if their work is part of the company's normal business operations. (And drivers' work certainly is.) The idea is to distinguish between the workers who drive the company's business and genuinely independent contractors hired for truly incidental work.
The change of status imposed by the law would mean that drivers – Uber and Lyft have about 220,000 in California alone – are entitled to unemployment insurance and family leave, earn minimum wage and overtime, and have the right to bargain collectively. By one estimate, Uber would need to spend $500 million to comply with the new law.
Faced with such a cost estimate, Tony West, Uber's chief legal officer, offered an ingenious defense: the company's drivers are not employees, even under the proposed new law, because Uber's core business is not driving people around. The company, he told reporters Wednesday, "serves as a technology platform for several different types of digital marketplaces." (Uber also offers food and freight delivery, along with bike and scooter rentals.) So, West argued, the entire ride business falls "outside the usual course" of Uber's business, as the law puts it.
This particular definition of Uber's business matters because the law was written to let companies rely on true independent contractors, such as someone hired to spend a few days repainting the bathrooms. Or, um, freelance journalists, who were among those granted exemptions.
West took to Twitter to blame reporters who interpreted his statement as meaning Uber was preparing not to comply with the law if the governor signed it. To The New York Times, which ran a headline saying Uber does not consider itself subject to the new California gig-worker law, he replied: "That's completely wrong. @Uber will absolutely comply with the law – but the law does not require that contract workers be reclassified as employees. I said so clearly today on a call with your reporters." Uber is a platform company, not a transportation company. That's their story, and they're sticking to it.
In response to West, New York Times reporter Noam Scheiber posted on Twitter an excerpt of a dialogue from a New York court hearing in which an Uber lawyer made this argument to a judge:
Judge: How would Uber generate revenue through this app we've been discussing without the drivers on the platform?
Uber: Right. So, you know – riders are the ones who need the drivers. They are the ones requesting the service; we are the marketplace in between. Would it be ideal for us if there were riders and no drivers? No, but we are the marketplace – and we think we can create a balance between those two things.
Judge: But if you have users – people requesting rides – and no drivers, how would that work?
Uber: Would it work? To my knowledge, it would be a marketplace that wasn't functioning properly.
Judge: Apparently not.
If you insist that drivers are not central to your business when they obviously are, you end up making strange rhetorical U-turns. Yet Uber is hardly alone in publicly denying what is obvious to anyone paying even passing attention.
Similarly, Facebook and Google insist they are not publishers. They are platforms, which host works and apply algorithms so that people are sure to find what they want. White nationalism? Anti-vaccine pseudoscience? A publisher might worry about distributing such material, but not them. They're platforms, remember?
YouTube is not a TV channel. Are young children being targeted with fraudulent advertising? Do recommended videos fuel anger and isolation? A television channel might care, or be required to care. But a platform? Right.
And now we learn that Uber is not a transportation company but a platform. Do drivers live in poverty? Are they overworked? An employer is invested in its workers' well-being and can be held accountable by unions and government regulators. But a platform? That is just another word for a marketplace, and the marketplace doesn't believe in tears.
There is a reason some people call Silicon Valley a hotbed of cruelty. The platform defense seems like an easy justification for turning a blind eye to social destruction. But it is even more insidious to discard basic, time-tested norms of relationship – between storytellers and their audiences, between hosts and guests, between employers and their employees.
Uber's resistance to treating drivers as employees stems from its ultimate goal of running a fleet of driverless cars; the drivers are only a stopgap. I suppose getting closer to them now would only make it harder to say goodbye.
In a recent review of Malcolm Gladwell's new book, Talking to Strangers, Andrew Ferguson writes that he is surprised Gladwell dwells on the harm done by people's tendency, even strangers', to trust one another. "One of the unfortunate side effects, which Gladwell considers at great length, is that we are therefore ill-equipped to detect liars," Ferguson writes. "Fortunately for us, most people tell the truth most of the time. Our faulty built-in lie detectors seem a small price to pay for what is otherwise a vital social lubricant."
Each is an individual action. Together, they take on outsized meaning. A YouTube video with 100,000 views seems more worthwhile than one with 10, even though views – like almost every form of online engagement – can easily be bought. It's a paradoxical love story. And it's far from an accident.
Increased engagement is good for business, and the urge to check the score is an easy way to pull users back. As Twitter CEO Jack Dorsey put it at last year's WIRED25 conference: "Right now we have a big Like button with a heart on it, and we're incentivizing people to want it to go up" and to gain more followers.
But these tactics are drawing growing scrutiny for their impact on the health of the internet and of society at large. Publicly visible metrics – views, retweets, likes – are "one of the driving forces of radicalization," says Whitney Phillips, a media-manipulation researcher and professor at Syracuse University. It works both ways, she says: a user can be radicalized by consuming content, and a creator can be radicalized by users' reactions to their content, as they tailor their behavior to whatever most engages their audience.
These concerns are driving some companies to explore ways to promote "conversational health." Over the past year, Facebook, Instagram (owned by Facebook), Twitter, and YouTube have all moved to soften or remove key indicators of user engagement. The trend has given rise to a word you won't find in dictionaries: demetrication.
Yet some of the users the changes were meant to help – those who see metrics as an essential part of their experience – have decried them. That leaves the platforms in the delicate position of weaning users off an addiction the platforms themselves introduced.
Over the past year, even rumors of demetrication have sent users into a panic. When Dorsey followed up his comments about likes by wondering whether the button should exist at all, people freaked out. The panic peaked a few days later, after a Telegraph report detailed a meeting at which Dorsey reportedly questioned the usefulness of the like button and said it could disappear "soon." Users rushed to decry the decision, many updating their posts with threats to leave the platform; dozens of tweets criticizing the idea quickly went viral – all without Twitter saying anything official about the feature's fate.
The same thing happened in March, after users got wind of twttr, Twitter's semi-public test app, nicknamed "little T" internally, which hid metrics such as retweet counts. The change was meant to encourage users to focus on the content of tweets rather than on which ones were most popular.
The metrics were still visible to twttr users who tapped on a tweet, and the update had not been pushed to the official Twitter app. But the first reports on the feature sent users into a tizzy. The outrage over the possible change came so fast that Twitter issued a statement clarifying that it was just a test.
In recent months, Twitter has continued testing the feature among twttr users, with mixed results. The experiment has led to a drop in overall engagement, a spokesperson confirmed, which bodes ill for its chances of reaching the main Twitter app.
At WIRED, that kind of thing drives us crazy, and not in a good way. Because our natural state – the bent of WIRED's instincts – is to find our way to a better place through the genius of science and technology, guided by people who know how to steer it toward the future and who say we can do better. Humans can build great things. In fact, great things are being built right now.
That's why, from November 7 to 10, WIRED is bringing together a group of tenacious people – people who inspire us, who remind us that we have overcome malaise before and can do so again. At our second annual WIRED25 festival in San Francisco, we will hold conversations with tech leaders and scientific luminaries, heads of organizations (from the NSA to Slack), and kids committed to fixing our climate. And science-fiction writers, actors, and creators. The event also ties in with WIRED's November issue and our list of 25 people who, in 2019, are using science and technology to create a future we all want to live in.
Take Dawn Song: her startup, Oasis Labs, uses the blockchain to make our online interactions safer and more secure. Or Jack Conte, CEO of Patreon, which lets artists and musicians make money on an internet where information wants to be free but artists need to be at least a little bit expensive. And the wonderful N. K. Jemisin, the only science-fiction writer to win three consecutive Hugo Awards and, let's face it, to blow our minds with her world-building.
In fact, dozens of icons and agitators will join WIRED's reporters and editors on stage, including
It will be inspiring and enlightening. But the weekend will be fun, too. Saturday and Sunday will be filled with music and film screenings, as well as robots and paper airplanes. The actor Chris Evans will screen his new film Knives Out – with writer-director Rian Johnson – and talk about his new project to help our democracy stand up straight.
Come join us (you can buy tickets right here) and be revitalized by the damned smart people committed to solving the world's problems. To hell with the haters. We can do this.
On Monday, the attorneys general of 48 states, as well as the District of Columbia and Puerto Rico, said they opened an investigation to determine whether Google abuses its power as the dominant Internet search provider.
The news follows a similar investigation of Facebook, revealed Friday, by eight states and DC, led by New York Attorney General Letitia James. Meanwhile, the Justice Department and the Federal Trade Commission are conducting their own antitrust reviews of big tech companies. The Justice Department's investigation reportedly encompasses Apple and Google, while the FTC is said to be looking at Facebook and Amazon.
At a press conference in Washington on Monday, Texas Attorney General Ken Paxton, who is leading the Google investigation, said the states are still gathering information and declined to discuss what measures they might take if they find evidence of anticompetitive behavior. But it is clear that the attorneys general are concerned about Google's dominance of search.
"There is nothing wrong with being the dominant player if it's done fairly," Utah Attorney General Sean Reyes said at the press conference. "But there is a reason so many of us have come together."
The investigation focuses on online advertising, particularly around search. Other aspects of Google's business, such as data-privacy issues or its licensing rules for the Android operating system, are not at the center of the investigation – yet.
Several attorneys general pointed to the company's growing tendency to surface links to its own services, or paid advertising, above unpaid links in search results.
The only states not involved in the Google investigation are California and Alabama. "California remains deeply concerned and committed to fighting anticompetitive behavior," said a spokesperson for Attorney General Xavier Becerra. "But to protect the integrity of our work, we cannot comment – to confirm or deny – on any ongoing or potential investigation."
"We have always worked constructively with regulators and we will continue to do so," Kent Walker, Google's senior vice president of global affairs, said in a blog post. "We look forward to showing how we are investing in innovation, providing services that people want, and engaging in robust and fair competition." Facebook did not respond to a request for comment.
The states are far from the first to raise these issues. The Obama-era FTC investigated potentially anticompetitive behavior by Google, but in 2013 it decided not to bring a case after the company made a few changes, including letting companies like TripAdvisor and Yelp opt out of having their content used in Google's own services. The European Union has fined Google about $9 billion in three cases involving alleged anticompetitive practices since 2017, including one for prioritizing its own content over others' in search results.
States can add considerable firepower to cases involving corporate wrongdoing. For example, 20 state attorneys general joined the Justice Department in the landmark antitrust case against Microsoft launched in 1998.
Attorneys general from 46 states entered into a master settlement agreement with the four largest tobacco companies. States such as Oklahoma are also behind much of the litigation against pharmaceutical companies over the opioid crisis.
States have sometimes disagreed with federal regulators on tech policy. In 2017, New York Attorney General Eric Schneiderman announced a multistate lawsuit against the Federal Communications Commission in an attempt to stop the agency from abandoning its net-neutrality protections; other states have passed their own laws prohibiting ISPs from blocking or otherwise discriminating against lawful content. In June, nine states and the District of Columbia sued to block T-Mobile's acquisition of Sprint, which the FCC and the Justice Department have approved.
These kinds of AI-generated videos, called deepfakes, are unsettling. Imagine the faces of politicians, activists, or journalists superimposed onto a pornographic video, for example. Or consider how a bully could use the technology to torment classmates.
For now, telltale signs such as unnatural facial features or audio glitches make deepfakes relatively easy to spot. But the technology keeps improving. Mike Schroepfer, Facebook's CTO, worries that AI experts are spending too much time perfecting deepfakes and not enough time finding ways to detect them.
On Thursday, Facebook, Microsoft, the Partnership on AI coalition, and academics from seven universities launched a competition to encourage better methods of detecting deepfakes. The organizers of the Deepfake Detection Challenge have not yet specified a prize. The competition will run from late 2019 through spring 2020.
Once screened, entrants will have access to a collection of deepfake videos that Facebook plans to release by December. The videos will feature professional actors who agreed to have their faces used in deepfakes, and, according to Schroepfer, they will look as much as possible like real videos on Facebook. No Facebook user data will be used, and entrants will not have to use Facebook videos to enter the contest.
Contests are a common way to encourage researchers and hobbyists to solve hard computing problems. Netflix famously ran a contest to build a better movie-recommendation engine, and the website Kaggle hosts data-science competitions for a wide range of companies and organizations.
A big challenge for deepfake researchers is that, as with any AI work, they need many examples of deepfakes to train a system to spot faked videos. According to Schroepfer, it is relatively easy to detect a deepfake when a system has already "seen" the original video. For example, if a system is already familiar with the original clip from Forrest Gump, it can flag that something has changed. But if a malicious actor records an original video and then manipulates it, detection by an AI system is much harder. The deepfakes created for Facebook's dataset are meant to help solve this problem.
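The "already seen the original" point can be illustrated with a toy perceptual hash: frames that look alike hash alike, so a tampered frame stands out when compared against a known original. This is a simplified sketch, not Facebook's detection method; the tiny "frames" and function names are invented for the example.

```python
def average_hash(pixels):
    """Perceptual 'average hash' of a tiny grayscale frame:
    1 where a pixel is brighter than the frame's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A known original frame, and a copy with one region altered
original = [[10, 10, 200, 200],
            [10, 10, 200, 200]]
tampered = [[10, 10, 200, 200],
            [200, 200, 200, 200]]

# An unmodified frame matches its own hash exactly;
# the tampered copy differs, flagging a change
assert hamming(average_hash(original), average_hash(original)) == 0
assert hamming(average_hash(original), average_hash(tampered)) > 0
```

When no reference frame exists – the malicious-actor case above – there is nothing to hash against, which is why the challenge aims at detectors that work from the fake alone.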
Schroepfer says Facebook already deploys every known detection method, but he hopes the contest will surface new ways to detect deepfakes. The goal is not a system that stops all deepfakes forever, but ways to make creating a passable deepfake harder and more expensive.