This is the end of the culture of presenteeism and micromanagement. Some companies may be tempted to use spyware to keep an eye on employees, but they will quickly realize that doing so is deeply retrograde and makes no sense. The only alternative is a change of culture, with an intense focus on trust and empowerment that allows employees to make their own decisions: except for the hours when a meeting is set, the rest of the time must be self-managed. Whether I prefer to work in the morning, afternoon or evening is my business, as long as I meet my goals. If you expect me to answer a message immediately at ten o’clock at night and you get angry if I don’t, you are a fool — although that doesn’t mean I can’t do so sometimes, if I think it’s appropriate. A people-centered culture, with all that that entails.
It was supposed to be groundbreaking. When New York City’s task force to develop policy on algorithm technologies was introduced two years ago, it was praised as a beacon of transparent and equitable government. It was supposed to inform other policymakers grappling with how to address their own use of automated technologies that make decisions in place of humans.
But for all its good intentions, the effort bogged down in a bureaucratic morass. The task force failed to complete even a necessary first step in its work: getting access to basic information about the automated systems already in use, according to task force members and observers.
Over the past decade, algorithmic accountability has become an important concern for social scientists, computer scientists, journalists, and lawyers. Exposés have sparked vibrant debates about algorithmic sentencing. Researchers have exposed tech giants showing women ads for lower-paying jobs, discriminating against the aged, deploying deceptive dark patterns to trick consumers into buying things, and manipulating users toward rabbit holes of extremist content. Public-spirited regulators have begun to address algorithmic transparency and online fairness, building on the work of legal scholars who have called for technological due process, platform neutrality, and nondiscrimination principles.
As much as I look into what’s being done with deep learning, I see they’re all stuck there on the level of associations. Curve fitting. That sounds like sacrilege, to say that all the impressive achievements of deep learning amount to just fitting a curve to data. From the point of view of the mathematical hierarchy, no matter how skillfully you manipulate the data and what you read into the data when you manipulate it, it’s still a curve-fitting exercise, albeit complex and nontrivial.
SAN FRANCISCO — In recent months, satellite photos have streamed into a former textile factory here revealing a build-up of potent Russian air defense systems in Ukraine, a serious new threat to NATO aircraft. This is not a secret CIA facility, and the images didn’t come from a billion-dollar surveillance satellite. They were taken by private spacecraft — some the size of a loaf of bread — operated by Planet Labs, a Silicon Valley company that is leading a revolution in how humans glimpse Earth from space. A short stroll from the downtown San Francisco headquarters of Yelp and LinkedIn, Planet operates the largest and least expensive fleet of satellites in history — the first to take pictures of the entire landmass of the globe, once a day, and sell them to the public.
Since the introduction of so-called less-lethal weapons in France in the late 1990s, there has been no legal requirement to collect data on injuries induced by kinetic impact projectiles, and no epidemiological surveys have been planned. To estimate the number of patients with ocular injuries caused by the use of these defensive tools, a retrospective survey was sent to all ophthalmology department chairs in French university hospitals, which are where the most severe cases are managed.
Since August, as vast stretches of the Amazon rainforest were being reduced to ashes and outrage and calls for action intensified, a group of lawyers and activists who have been advancing a radical idea have seen a silver lining in the unfolding tragedy. One day, a few years from now, they imagine Brazil’s president, Jair Bolsonaro, being hauled to The Hague to stand trial for ecocide — a term broadly understood to mean the willful and widespread destruction of the environment, and one that, they hope, will eventually be on par with other crimes against humanity.
It is the technology that is changing the face of the school drop-off. At many London primary schools, alongside the children arriving in cars, on bikes and by scooter, a growing number are arriving strapped into the boxes of cargo bikes — high-capacity, load-carrying bikes or tricycles that are also an increasingly popular, environmentally friendly and efficient means of delivering goods.
The rise of this apparently old-fashioned technology suggests to many observers that, despite the widespread excitement about the prospects for self-driving cars and air taxis, the future of mobility in old and crowded cities is likely to be powered by human muscle.