Tuesday, September 6, 2016

Math is racist and the dog in the grill

It's about the math in life: you may not use it, but they sure do out there,
without you knowing it most of the time.

For me, in the '90s I worked at a hotel and found out they
had a dead peasant insurance policy on me when HR talked to me,
wanting to know why they were spending so much on me: a 1.5 million
policy. That is not big now, but back in 1990 it was big!
At the time mine was the second highest in my state.
My co-workers called me a dead man walking! And I am still here!
And I also have to note it helped me get jobs, given the cost they were
paying for the insurance! It showed value!

Also bad are the unknowns, those things you don't know about; when they
come back later it's funny as hell! I had a co-worker who was
micro-criticizing me, as in she didn't like the way I ate, didn't like the
way I sat in the chair, didn't like the way I sneezed, etc.
She was bipolar or OCD or something. Many workers knew she was
messed up, and many got annoyed to the point that most of the workers
once sat in the break room with bags of chips during lunch.
When she came in, everyone grabbed a chip and crunched
at the same time. The room was loud and it was all at the
same time. CRUNCH... CRUNCH... CRUNCH... etc.
She was just standing there with her eyes wide open, in a personal hell!
She ran to HR and we all ran for it!

She kept a notepad with her a lot, and I am sure she wrote down what happened.
One day she was on me again, so I let her have as much BS as she could take.
I told her I cooked a dog on a grill: I just threw him in there and shut the lid fast.
He yelled a lot, and the dog was OK, a bit fatty but not that bad. I went on and
on until she told me no more, and the whole time she was writing it down.
I thought I had finally burned her out. Keep in mind, the view in the '90s about
workers like that was to "let the crazies be crazy; you can't make them not."
They will burn out and leave sometime; just hang in! Well, she finally did.

What I didn't know was that she took what I said to the NLRB, and it ended up
in one of those unknown files tagged to me. I only found out five or so
years later, when I went for another job and the interviewer, the boss, saw the
background check file. He asked me if I cooked a dog! LOL!
He told me it sounded like I was a killer and he was wondering why I
was running loose, but he figured it was BS when he got to the part about
the dog. I told him what I had actually done, and he told me the lady sounded
like she had problems!

The point of that? Well, there are things out there you don't know about.
Like the math of things! Like having a credit check run alongside a job interview.
That is bad if you are poor, because it takes money to have credit.
If you can't afford it, don't get it! Or you try to get it, but the bank's underwriter
says "Sorry, your income is too low!" so you don't get it and have no credit!

Life is crappy, but ask: did you need it anyway? Live within your means!
I do without, and life is good without the hoopla! I know they will have to raise
the pay; they can't have most of us wiping out their stores by staying home and doing without.

~~~~~Math is racist: How data is driving inequality
In a new book, "Weapons of Math Destruction," Cathy O'Neil details all the ways that math is essentially being used for evil (my word, not hers).

From targeted advertising and insurance to education and policing, O'Neil looks at how algorithms and big data are targeting the poor, reinforcing racism and amplifying inequality.

These "WMDs," as she calls them, have three key features: They are opaque, scalable and unfair.

Denied a job because of a personality test? Too bad -- the algorithm said you wouldn't be a good fit. Charged a higher rate for a loan? Well, people in your zip code tend to be riskier borrowers. Received a harsher prison sentence? Here's the thing: Your friends and family have criminal records too, so you're likely to be a repeat offender. (Spoiler: The people on the receiving end of these messages don't actually get an explanation.)

The models O'Neil writes about all use proxies for what they're actually trying to measure. The police analyze zip codes to deploy officers, employers use credit scores to gauge responsibility, payday lenders assess grammar to determine credit worthiness. But zip codes are also a stand-in for race, credit scores for wealth, and poor grammar for immigrants.
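To make the proxy idea concrete, here is a minimal Python sketch. All the zip codes, rates and numbers below are invented for illustration (they are not from the book): a pricing rule that only ever looks at zip code can still reproduce a racial disparity, because the zip code stands in for race.

```python
# Hypothetical illustration of a proxy variable. Because of housing
# segregation, zip code correlates with race, so a rule that never
# sees race can still encode it. All numbers here are made up.
LOSS_RATE_BY_ZIP = {
    "60601": 0.02,  # wealthier neighborhood
    "60621": 0.09,  # poorer neighborhood
}

BASE_RATE = 0.04  # lender's base interest rate (assumed)

def quoted_rate(zip_code: str) -> float:
    """Price a loan using only the zip code's historical loss rate."""
    return BASE_RATE + LOSS_RATE_BY_ZIP[zip_code]

# Two applicants with identical income, debts and payment history:
print(round(quoted_rate("60601"), 2))  # 0.06 -> cheaper credit
print(round(quoted_rate("60621"), 2))  # 0.13 -> costlier credit, race never "used"
```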

O'Neil, who has a PhD in mathematics from Harvard, has done stints in academia, at a hedge fund during the financial crisis and as a data scientist at a startup. It was there -- in conjunction with work she was doing with Occupy Wall Street -- that she became disillusioned by how people were using data.
"I worried about the separation between technical models and real people, and about the moral repercussions of that separation," O'Neil writes.

She started blogging -- at mathbabe.org -- about her frustrations, which eventually turned into "Weapons of Math Destruction."

One of the book's most compelling sections is on "recidivism models." For years, criminal sentencing was inconsistent and biased against minorities. So some states started using recidivism models to guide sentencing. These take into account things like prior convictions, where you live, drug and alcohol use, previous police encounters, and criminal records of friends and family.

These scores are then used to determine sentencing.
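As a rough sketch of how such a score might be assembled (the real instruments are proprietary, and every factor and weight below is invented for illustration), a recidivism-style model can be as simple as a weighted checklist:

```python
# Toy recidivism-style risk score. Real models are proprietary and far
# more complex; these factors and weights are invented.
WEIGHTS = {
    "prior_convictions":       2.0,  # the defendant's own record
    "high_crime_neighborhood": 1.5,  # where you live
    "substance_use":           1.0,
    "police_encounters":       0.5,  # even stops without charges
    "family_criminal_records": 1.0,  # other people's behavior
}

def risk_score(defendant: dict) -> float:
    """Sum the weighted factors; a higher score means a 'riskier' label."""
    return sum(w * defendant.get(k, 0) for k, w in WEIGHTS.items())

d = {"prior_convictions": 1, "high_crime_neighborhood": 1,
     "police_encounters": 3, "family_criminal_records": 2}
print(risk_score(d))  # 7.0 -- only 2.0 of it comes from his own convictions
```

Note that only the first factor is the defendant's own conviction record; the rest are exactly the proxies O'Neil objects to.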
"This is unjust," O'Neil writes. "Indeed, if a prosecutor attempted to tar a defendant by mentioning his brother's criminal record or the high crime rate in his neighborhood, a decent defense attorney would roar, 'Objection, Your Honor!'"

But in this case, the person is unlikely to know the mix of factors that influenced his or her sentencing -- and has absolutely no recourse to contest them.

Or consider the fact that nearly half of U.S. employers ask potential hires for their credit report, equating a good credit score with responsibility or trustworthiness.

This "creates a dangerous poverty cycle," O'Neil writes. "If you can't get a job because of your credit record, that record will likely get worse, making it even harder to work."

This cycle falls along racial lines, she argues, given the wealth gap between black and white households. This means African Americans have less of a cushion to fall back on and are more likely to see their credit slip.
And yet employers see a credit report as data rich and superior to human judgment -- never questioning the assumptions that get baked in.

In a vacuum, these models are bad enough, but O'Neil emphasizes, "they're feeding on each other." Education, job prospects, debt and incarceration are all connected, and the way big data is used makes them more inclined to stay that way.

"Poor people are more likely to have bad credit and live in high-crime neighborhoods, surrounded by other poor people," she writes. "Once ... WMDs digest that data, it showers them with subprime loans or for-profit schools. It sends more police to arrest them and when they're convicted it sentences them to longer terms."

In turn, a new set of WMDs uses this data to charge higher rates for mortgages, loans and insurance. So, you see, it's easy to be discouraged.
And yet O'Neil is hopeful, because people are starting to pay attention. There's a growing community of lawyers, sociologists and statisticians committed to finding places where data is used for harm and figuring out how to fix it.

She's optimistic that laws like HIPAA and the Americans with Disabilities Act will be modernized to cover and protect more of your personal data, that regulators like the CFPB and FTC will increase their monitoring, and that there will be standardized transparency requirements.

And then there's the fact that these models actually have so much potential.
Imagine if you used recidivism models to provide at-risk inmates with counseling and job training while in prison. Or if police doubled down on foot patrols in high crime zip codes -- working to build relationships with the community instead of arresting people for minor offenses.

You might notice there's a human element to these solutions. Because really that's the key. Algorithms can inform and illuminate and supplement our decisions and policies. But to get not-evil results, humans and data really have to work together.

"Big Data processes codify the past," O'Neil writes. "They do not invent the future. Doing that requires moral imagination, and that's something only humans can provide."
http://money.cnn.com/2016/09/06/technology/weapons-of-math-destruction/index.html