… if you’re not careful, modelling has a nasty way of enshrining prejudice with a veneer of “science” and “math.”

Cathy has consistently made another point that’s a corollary of her argument about enshrining prejudice. At O’Reilly, we talk a lot about open data. But it’s not just the data that has to be open: it’s also the models. (There are too many must-read articles on Cathy’s blog to link to; you’ll have to find the rest on your own.)
You can have all the crime data you want, all the real estate data you want, all the student performance data you want, all the medical data you want, but if you don’t know what models are being used to generate results, you don’t have much.
Consider a company that sells electronic devices. Say that historically it has sold well to companies that value its fast delivery and the quality of its product. As time passes, competition grows and a global trend toward green products emerges. The profile of the company’s ideal customer slowly shifts, and the shift could easily go unnoticed in a manual examination of the market. Those small shifts are identifiable, however, by algorithms that continuously monitor the company’s historical sales cycle, cross-reference it with external sources such as social media posts and newspaper articles discussing these trends, and find correlations with the propensity to buy. Given the size of this information base and its unstructured nature, monitoring all of those subtle changes in real time is an almost impossible task for a human analyst.
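As a minimal sketch of the kind of monitoring described above, one could track a rolling correlation between an external trend signal and sales volume. The file names, column names, window size, and alert threshold here are all illustrative assumptions, not details from the example:

```python
import pandas as pd

# Hypothetical inputs: weekly unit sales, and an external "green trend"
# index derived from social media posts and news mentions (assumed data).
sales = pd.read_csv("weekly_sales.csv", parse_dates=["week"], index_col="week")
trend = pd.read_csv("green_trend_index.csv", parse_dates=["week"], index_col="week")

df = sales.join(trend, how="inner")

# Rolling 26-week correlation between the external trend signal and sales.
# A sustained rise would suggest the ideal-customer profile is shifting
# toward buyers who value green products.
rolling_corr = df["units_sold"].rolling(window=26).corr(df["green_trend_index"])

# Flag weeks where the correlation crosses an (arbitrary) alert threshold,
# the kind of small shift a human analyst would struggle to spot manually.
alerts = rolling_corr[rolling_corr > 0.5]
print(alerts.tail())
```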
There should be a German word for that feeling when you desperately need to be somewhere on time, and the traffic lights seem like they’re conspiring against you. When the greens feel especially short, the reds especially long, and both appear as if they’re in cahoots to keep you idling.
It’s not a conspiracy, but it is something called fixed time control. Most intersections, unless they have sensors embedded under the road, follow signal timing derived from traffic observed months or years earlier. Fixed time control based on historical data usually works well for individual intersections, but what if entire cities coordinated their traffic lights to cut down on mass commute times and fuel use?
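As a rough illustration of how fixed time control apportions green time from historical counts: the cycle length, lost time, and volumes below are made-up numbers, and the proportional split is a textbook simplification, not any particular city’s actual timing method:

```python
# Illustrative fixed-time signal plan: split a fixed cycle's usable green
# time in proportion to historically observed approach volumes.
CYCLE_LENGTH = 90  # seconds per full cycle (assumed)
LOST_TIME = 10     # seconds lost to amber/all-red phases (assumed)

# Hypothetical historical counts (vehicles per hour) for each phase.
historical_volumes = {"north-south": 800, "east-west": 400}

total = sum(historical_volumes.values())
effective_green = CYCLE_LENGTH - LOST_TIME

for phase, volume in historical_volumes.items():
    green = effective_green * volume / total
    print(f"{phase}: {green:.0f}s of green per cycle")
```

Because the plan is computed once from past data, it cannot react when today’s traffic departs from the historical pattern, which is exactly why the reds can feel so unjustly long.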
The thermometer showed a 103.5-degree fever, and her 10-year-old’s asthma was flaring up. Mary Bolender, who lives in Las Vegas, needed to get her daughter to an emergency room, but her 2005 Chrysler van would not start.
The cause was not a mechanical problem — it was her lender.
Ms. Bolender was three days behind on her monthly car payment. Her lender, C.A.G. Acceptance of Mesa, Ariz., remotely activated a device in her car’s dashboard that prevented her car from starting. Before she could get back on the road, she had to pay more than $389, money she did not have that morning in March.
Big data is affecting on-the-floor work among caregivers in numerous ways. One of them is the decision-making process at the bedside.
“With the advent of [electronic medical records],” said Mickey Lynch, director of commercial strategy and innovation at Cadient Group, “a physician has a much broader set of information upon which to establish a path forward. Meaning that now he or she can quickly view notes from previous visits, quickly access lab values and access test results perhaps administered by other physicians. All of which provides the physician with a far more robust clinical view of the patient.”
Another way big data is making changes is in how medical facilities are managed, with resources allocated to match patient-traffic trends.
“I no longer look at somebody’s CV to determine if we will interview them or not,” declares Teri Morse, who oversees the recruitment of 30,000 people each year at Xerox Services. Instead, her team analyses personal data to determine the fate of job candidates.
She is not alone. “Big data” and complex algorithms are increasingly taking decisions out of the hands of individual interviewers – a trend that has far-reaching consequences for job seekers and recruiters alike.
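As a minimal sketch of the kind of data-driven screening the article describes, a model trained on past hires can score new applicants instead of a recruiter reading CVs. The features, file, and outcome variable are illustrative assumptions and do not reflect Xerox’s actual system:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical historical data: candidate attributes plus whether the
# hire stayed past six months (an outcome a recruiter might optimize for).
df = pd.read_csv("past_candidates.csv")
X = df[["commute_minutes", "typing_test_score", "assessment_score"]]
y = df["stayed_six_months"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Rank applicants by predicted retention probability rather than by CV.
print(model.predict_proba(X_test)[:5, 1])
```

One consequence worth noting: whatever prejudice is latent in the historical outcomes gets baked into such a model, which is precisely the concern raised in the first excerpt, and an argument for the models themselves being open to inspection.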