
Credit Scores Based On AI and Your Social Media Profile Could Usher In New Way For Banks To Discriminate (vice.com)

Credit scores have a long history of prejudice. "Most changes in how credit scores are calculated over the years -- including the shift from human assessment to computer calculations, and most recently to artificial intelligence -- have come out of a desire to make the scores more equitable, but credit companies have failed to remove bias, on the basis of race or gender, for example, from their system," writes Rose Eveleth via Motherboard.

While credit companies have tried to reduce bias with machine learning and "alternative credit," which uses data that isn't normally included in a credit score -- like your sexual orientation or political beliefs -- to try to get a sense of how trustworthy someone might be, Eveleth says that "introducing this 'non-traditional' information to credit scores runs the risk of making them even more biased than they already are, eroding nearly 150 years of effort to eliminate unfairness in the system." From the report: Biases in AI can affect not just individuals with credit scores, but those without any credit at all as non-traditional data points are used to try and invite new creditors in. There is still a whole swath of people in the United States known as the "unbanked" or "credit invisibles." They have too little credit history to generate a traditional credit score, which makes it challenging for them to get loans, apartments, and sometimes even jobs. According to a 2015 Consumer Financial Protection Bureau study, 45 million Americans fall into the category of credit invisible or unscoreable -- that's almost 20 percent of the adult population. And here again we can see a racial divide: 27 percent of Black and Hispanic adults are credit invisible or unscoreable (PDF), compared to just 16 percent of white adults.

To bring these "invisible" consumers into the credit score fold, companies have proposed alternative credit. FICO recently released FICO XD, which includes payment data from TV or cable accounts, utilities, cell phones, and landlines. Other companies have proposed social media posts, job history, educational history, and even restaurant reviews or business check-ins. Lenders say that alternative data is a benefit to those who have been discriminated against and excluded from banking. No credit? Bad credit? That doesn't mean you're not trustworthy, they say, and we can mine your alternative data and give you a loan anyway. But critics say that alternative data looks a lot like old-school surveillance. Letting a company have access to everything from your phone records to your search history means giving up all kinds of sensitive data in the name of credit. Experts worry that the push to use alternative data might lead, once again, to a situation similar to the subprime mortgage crisis if marginalized communities are offered predatory loans that wind up tanking their credit scores and economic stability.

  • by Anonymous Coward

    It could also be made illegal so that it does not happen. It will never happen in the bought and paid for USA but in sensible countries like the rest of the first world it could be stopped.

  • by mschuyler ( 197441 ) on Thursday June 13, 2019 @08:02PM (#58758702) Homepage Journal

    Their "social credit" score is far beyond our mere FICO scores. No, I'm not happy about this, but it's obvious that's where we're headed anyway. Just wait ten years.

  • I don't get what the problem is around banks figuring out better ways to discern who is a good risk for giving money to, and who is not...

    Indeed, this new mechanism described in the summary could well be a lot less prone to problems with racial stereotypes, as an AI looking at your social media profile wouldn't care what color you were, just how people regard you and how you treat others, rather than some possibly biased human agent making choices about a loan for you based on perception with few facts about who you are.

    • by burtosis ( 1124179 ) on Thursday June 13, 2019 @08:19PM (#58758764)

      Indeed, this new mechanism described in the summary could well be a lot less prone to problems with racial stereotypes, as an AI looking at your social media profile wouldn't care what color you were, just how people regard you and how you treat others, rather than some possibly biased human agent making choices about a loan for you based on perception with few facts about who you are.

      So who programs this AI, and what data sets is it trained on? An algorithm powered by weak AI can easily discriminate and have plausible deniability at the same time. The programmers and the training sets are the reason there has been massive discrimination in the past, intentional or not.

      • So who programs this AI and what data sets is it trained on?

        You can say that about EVERY human lending agent at every bank. Only they are also prone to the natural human bias of trusting people who seem significantly different from themselves less.

        If the tables were flipped and it was the AI giving out loans today you would think it was bonkers to give the choice of who gets loans out to a fleet of a hundred thousand randos, each with individual biases and also introducing the possibility of corruption (se

      • by Anonymous Coward

        This was done, very effectively, 20 years ago by the Swift Group in England. They evaluated sub-prime mortgage candidates for the risk of their loans. They were dispassionate. They did not bring racial stereotypes to the table, but judged only on the data. And if the banks of England had used them more widely, they would not have suffered the sub-prime mortgage crisis. But the agents were too busy making their quarterly numbers, and reselling the mortgages, to say "slow down!!!! some of th

    • by Anonymous Coward

      I don't get what the problem is around banks figuring out better ways to discern who is a good risk for giving money to, and who is not...

      This is the whole "reality is biased" bullshit again. Computers have no bias. If they determine that a certain bit of information is a good signal for making a decision, THAT'S BECAUSE IT IS, NOT BECAUSE THE COMPUTER IS RACIST.

      I'm getting so sick and tired of this "we must fix reality to meet our biases, rather than accept certain uncomfortable truths." Computers have no biases. AI have no biases. If they've discovered that they can reliably predict risk based on certain information that just so happens to

      • Computer models can be wrongly biased, especially in light of spurious correlations. Vision AI will sometimes think a random image of static is a cat. That being said, not every inequality is evidence of bias.

    • is when racism is built into your institutions. The most famous example is crack cocaine carrying stiffer penalties than powdered cocaine, but there are much less blunt examples. You can "redline" black neighborhoods for home loans based on zip code, so that even when a family gets good jobs they still end up segregated, because they can't get the kinds of low-interest loans needed to move to nicer neighborhoods.

      First, let's be clear, we're not talking about AI, we're talking about very, very complex co
      • Re: (Score:1, Informative)

        by Anonymous Coward

        The most famous example is crack cocaine carrying stiffer penalties than powdered cocaine, but there are much less blunt examples.

        WTF does this have to do with anything?

        You can "redline" black neighborhoods for home loans based on zip code, so that even when a family gets good jobs they still end up segregated because they can't get the kinds of low interest loans needed to move to nicer neighborhoods.

        Redlining has been illegal for years. You can't do that any more. It's not a thing.

        First, let's be clear, we're not talking about AI, we're talking about very, very complex computer models.

        Here, let me quote literally the first line of the summary for you:

        most recently to artificial intelligence

        We're literally talking about AI.

        It's basically why blacks are as badly off as they are when taken as a whole. They were excluded from a good chunk of the New Deal and the social programs of the 60s & 70s.

        (checks calendar)

        Look at that, it's still the 21st century. What does crap that happened 60 years ago have to do with today?

        The bottom line is that these new computer AI models can't have bias because they're based on nothing but math. Unless you want to claim that math itself is somehow b

      • The most famous example is crack cocaine carrying stiffer penalties than powdered cocaine

        It's famous and it's wrong. Black leaders supported the stricter crack cocaine laws because they saw what crack was doing to their communities. [wnyc.org]

        • You're setting up an institution that disproportionately impacts a certain racial group. To be fair, the fact that everyone knew that was going to happen tips it toward overt racism, but it's still building racism into your institutions, intentional or not.



          Also they were duped by propaganda. It's not that surprising, especially in the 80s, with limited information available and fearmongering running high.
    • Yeehaw - kick the poor!

      If banks don't help make the rich richer and the poor poorer, who will?! Long live the class system! Those filthy deplorable proles DESERVE absolutely everything they do to be harder and more expensive! Housing and employment discrimination are the best! Ignore the teachings of Jesus and most other spiritual leaders throughout history - moneylenders are super duper awesome and should be allowed to control every aspect of life!

      In summary: I've got mine, so screw you Jack! Fuck you, p

    • I don't get what the problem is around banks figuring out better ways to discern who is a good risk for giving money to, and who is not...

      Indeed, this new mechanism described in the summary could well be a lot less prone to problems with racial stereotypes, as an AI looking at your social media profile wouldn't care what color you were, just how people regard you and how you treat others, rather than some possibly biased human agent making choices about a loan for you based on perception with few facts about who you are.

      Banks cannot just give out money to everyone, or they will collapse. So there has to be SOME way for them to choose who gets lent money. Again, why is a mechanism less prone to human fallibility and individual judgment worse in any way? The new system will also be more even-handed.

      The problem is when the mechanisms are accurate, but result in disparate outcomes. People don't like that, we have cultural bugaboos about it, so we have to pretend that they aren't accurate (or that they could be practically more accurate, fallacy of the false alternative).

  • 27 percent of Black and Hispanic adults are credit invisible or unscoreable (PDF), compared to just 16 percent of white adults

    They should talk to a loan agent of some sort, who could identify these 'unscoreable' [youtu.be] people in the first place. Then they could be given an identifying mark for future reference.

    • 27 percent of Black and Hispanic adults are credit invisible or unscoreable (PDF), compared to just 16 percent of white adults

      They should talk to a loan agent of some sort, who could identify these 'unscoreable' people in the first place. Then they could be given an identifying mark for future reference.

      No, someone needs to explain to the people who are "credit invisible" that IF (small word, big meaning) they wish to establish some reasonable capability to borrow money, they need to, well, borrow some mone

  • People already use one sanitized and "fake" account for work/business and one for actual use. It's not a matter of permissions; they can dig as far as they want and it's clean. Then banks/businesses think they are getting the real deal when it's just more layers of BS; maybe that is part of responsible lending, but I doubt it. The future is looking like privacy can only be held by increasing the noise and burying the signal - a bit like if you are forced to use an Amazon echo you could just let it listen
    • No problems here (Score:5, Informative)

      by SuperKendall ( 25149 ) on Thursday June 13, 2019 @09:18PM (#58758936)

      People already use one sanitized and "fake" account for work/business and one for actual use.

      Someone willing to put in the work of maintaining a clean account sounds like an excellent credit risk to me. The lazy or inept simply would not bother.

      Also good luck actually keeping the two separate with facial recognition tie-ins across all accounts. The link will be outed - if not by yourself, then by the world around you.

  • by BrendaEM ( 871664 ) on Thursday June 13, 2019 @08:38PM (#58758820) Homepage
    You cannot state that someone is a credit risk without proof.
    • by Anonymous Coward

      Everyone is a credit risk; this is just assigning a score, but the risk is three-way: 1. total loss, 2. partial loss, 3. no loss.

      If you live on the street and have a needle sticking out of your arm, then I would put you closer to 1. If you have had a good job for 20 years and own your home mortgage-free, I would put you closer to 3. I have no proof, only evidence (you may not know the difference).

    • BrendaEM is a credit risk.

      This comment shall be forever archived on the internet, for A.I.s to find, tie to your real identity, and then factor in how much risk such public statements about you carry.

      Deal with it, ignorant twat. I suggest trying to deal with it by suing me for libel, because the A.I.s will love to also factor in frivolous lawsuits about someone's creditworthiness. Very informative.
    • Correct, under the old & virtuous Common Law, the practice of "credit scoring" would be considered slander. Perhaps racketeering as well. Quite illegal.

      However the wicked business practices of the big 3 Social Credit rating companies are exempted from the Common Law by explicit statutory authorization. The statute is euphemistically named the Fair Credit Reporting Act.

      Will these upstart Social Credit rating companies be able to comply with the FCRA? Doubtful. Most industry-written laws are intended

    • You cannot state that someone is a credit risk without proof.

      You cannot state that someone is a good target for lending without proof.

      That is what the "credit companies" are offering. The negative of that, which is what you are stating, is not something they are claiming, even if it can be inferred.

  • If X percent of people who are like Y have bad credit, that's not bias; that's reality.

    Let's face it, poor people will have bad credit. Rich people can have bad credit too, it's just less likely. But I'm sure the likelihood of a rich person having bad credit is about the same as a poor person having good credit.

    • What the fuck do you think bias is? "X percent of people who are Y are Z" is a correlation. Unless it's causal at near 100%, assuming Y causes Z is being biased. Of course, what you claim to be doing is assessing different likelihoods, but that probability turns into a credit score. Your philosophy lets people get their credit scores dinged for being black, or whatever.

  • by Antique Geekmeister ( 740220 ) on Thursday June 13, 2019 @09:57PM (#58759072)

    > While credit companies have tried to reduce bias with machine learning and "alternative credit,

    As politically incorrect as this may be to say, being too young, too old, black, female, transgender, or medically disabled are all tangible proof of increased credit risk. Being born healthy, Jewish or Asian, male, young, and with a wealthy family all are tangible proof of reduced credit risk. They may also each be individually or collectively illegal to discriminate on the basis of, but they are all _tangible credit risks_.

    Banks, landlords, schools, and many other businesses are heavily regulated to prevent discrimination, but many forms of it make powerful and effective business sense. Even simple correlations, such as the likelihood of single motherhood, are factors not allowed for discrimination but likely to devastate the finances of a loan applicant. So there is no surprise, and should be no surprise, that lenders would seek information to reduce their risks even if it is tied to factors that are illegal to discriminate with.

    • Perhaps it's time we the people ask, which do we value more: moneylenders' profits, or macrosocial cohesion?

      • Perhaps it's time we the people ask, which do we value more: moneylenders' profits, or macrosocial cohesion?

        Or we could ask "do we really want anyone lending to anyone else?" Because if a lender can't legally make a decision like "a single mother is more likely to default on a loan than a married couple (of whatever ethnic or otherwise protected group)", then the obvious solution is to either:

        a) stop lending money altogether, which hurts everyone (but especially the poor), or

        b) raise interest rates acr

        • "stop lending money altogether, which hurts everyone (but especially the poor)"

          Naw, brohamley. Outlawing usury would be immensely beneficial for the poor and the working class. Not only would the people be relieved of the parasites sucking their blood every month - but also, the cost of houses would drop like a rock.

          It's always the rich - including the sovereign - who benefit from moneylending. It's always the toiling masses who pay for that benefit in blood, sweat, and tears.

  • by Gravis Zero ( 934156 ) on Thursday June 13, 2019 @10:04PM (#58759110)

    Neural networks develop superstitions (just like people do) by correlating unrelated coincidences in the information. If they apply NNs to social media data, one could develop superstitions as silly as someone not having a picture of their car, or as racist as the color of your skin. The real problem with NNs is that you don't know what is being used to determine whether someone is a higher or lower risk. My real concern is that not having a social media account at all will default to the model believing you are high risk.
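    [Editor's note: a toy sketch of the point above. All data, feature names, and numbers here are invented for illustration; it shows how a model that is never shown a protected attribute can still discriminate through a correlated proxy feature, such as a zip-code indicator trained on historically biased approvals.]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000

# Invented data: 'race' is a protected attribute the model never sees;
# 'zip_group' is a stand-in for a zip-code feature that matches it 90%
# of the time (a proxy variable). Income is independent of race.
race = rng.integers(0, 2, n)
zip_group = np.where(rng.random(n) < 0.9, race, 1 - race)
income = rng.normal(0.0, 1.0, n)  # standardized income

# Historical approvals were biased: identical incomes, different outcomes.
approved = (10 * income - 8 * race + rng.normal(0, 2, n) > -5).astype(float)

# Plain logistic regression via gradient descent, trained WITHOUT race.
X = np.column_stack([np.ones(n), income, zip_group])
w = np.zeros(3)
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - approved) / n

# The proxy absorbs the historical bias: its learned weight is negative,
# and predicted approval rates still differ by the hidden attribute.
p = 1.0 / (1.0 + np.exp(-X @ w))
print("zip_group weight:", round(w[2], 2))
print("approval rate (race=0):", round(p[race == 0].mean(), 2))
print("approval rate (race=1):", round(p[race == 1].mean(), 2))
```

    Dropping the protected attribute from the feature set is not enough; any correlated input can reintroduce the bias, with plausible deniability intact.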

    • by Chrisq ( 894406 )

      Neural networks develop superstitions (just like people do)

      I don't. I've got a lucky charm to prevent it.

    • by AmiMoJo ( 196126 )

      AI should not be treated as a black box. In most cases it isn't, there are known inputs and different sub-AIs to process them, e.g. the AI module that handles object recognition in photos will be different from the one that handles loan repayment history data. The outputs of all those are then weighted and combined, and all that information can be recorded and presented on request.

      Make it a legal requirement for people to know how and why decisions about them were made (as it is under GDPR) and it will be n

      • Make it a legal requirement for people to know how and why decisions about them were made (as it is under GDPR) and it will be near impossible for companies to hide biases inside AI black boxes.

        Sounds great. How do you plan on getting Republicans on board? Just sayin'.

        • by AmiMoJo ( 196126 )

          I think you will have to wait until the Democrats control enough of the government to force it through.
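      [Editor's note: a minimal sketch of the "recordable" pipeline described above. The sub-score names and weights are invented; the point is only that when sub-model outputs are combined linearly, each factor's contribution can be logged and presented on request, GDPR-style.]

```python
# Invented weights and sub-score names, purely for illustration.
SUBSCORE_WEIGHTS = {
    "repayment_history": 0.6,
    "utility_payments": 0.3,
    "social_media": 0.1,
}

def combine(subscores: dict) -> tuple:
    """Weighted combination of sub-model outputs plus an audit trail."""
    total, trail = 0.0, []
    for name, weight in SUBSCORE_WEIGHTS.items():
        contribution = weight * subscores[name]
        total += contribution
        trail.append((name, subscores[name], weight, contribution))
    return total, trail

score, explanation = combine(
    {"repayment_history": 700, "utility_payments": 650, "social_media": 500}
)
print(round(score))  # 665
for name, value, weight, contribution in explanation:
    print(f"{name}: {value} x {weight} = {contribution:.1f}")
```

      A fully end-to-end black-box model offers no such trail, which is exactly the objection raised in the replies below; the record-keeping only works where the combination step stays this transparent.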

    • Neural networks develop superstitions (just like people do) by correlating unrelated coincidences in information.

      I think you are right that correlation can be a problem, but I don't think superstition is the right metaphor. Machine learning will latch onto correlations between the features and the label, and it will still have high accuracy on the training data. The problem is that this correlation might exist only in that data. Now, machine learning is not dumb enough to just use any correlation found in the training data.
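      [Editor's note: a tiny invented illustration of a correlation that exists only in the training sample. A learner that latches onto the coincidental feature scores perfectly on the data it was fit to and collapses to a coin flip on fresh data.]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000  # 1000 training rows, 1000 fresh rows; all data invented

# 'label' is driven by a real signal; 'coincidence' happens to equal the
# label in the training half but is an independent coin flip afterwards.
signal = rng.normal(0, 1, 2 * n)
label = (signal + rng.normal(0, 0.5, 2 * n) > 0).astype(int)
coincidence = np.concatenate([label[:n], rng.integers(0, 2, n)])

# A learner that latches onto the coincidence amounts to the rule
# "predict that the label equals the coincidental feature".
pred = coincidence
train_acc = (pred[:n] == label[:n]).mean()
fresh_acc = (pred[n:] == label[n:]).mean()
print(train_acc, fresh_acc)  # perfect in training, near chance on fresh data
```

      In credit terms: a pattern that held in last decade's loan book can look like a reliable risk signal while predicting nothing about the next applicant.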

      • AI should not be treated as a black box. In most cases it isn't,

        Sure, other than the self-learning/machine learning ones. Those that learn on big sets of data are not human readable. The box is inherently black by nature. You can't open it up and deduce why the weight on node201 is 0.12412352. The AI itself certainly doesn't know why any more than you can explain what your 293,395th neuron does. The reason it's at that value is because it was better at identifying who to give loans to. For that given data set. And a healthy dose of rand() thrown in for good measure.

        The

  • You want to convince someone else to hand you a big chunk of money on a promise that you'll pay them back? You need one of two things:

    1. A big bank account and a good credit and employment history that shows you can pay it back. This is basically your standard credit score.

    or

    2. Give enough personal info that tells another person that you're honorable and reliable. Give up this much info, it's gonna feel like you just submitted to big brother. You're not gonna like giving up that much info.
  • "Unbanked"? (Score:4, Interesting)

    by imperious_rex ( 845595 ) on Thursday June 13, 2019 @10:16PM (#58759154)
    Maybe it's because I've had at least one checking account since I was 17 (when I got my first job), but I have never been able to wrap my brain around how/why someone can function in a modern society being "unbanked" without a traditional bank or credit union account. A checking and savings account is essential to being financially functional, as one can't rely on cold hard cash all the time. How does an "unbanked" person pay apartment rent every month? Most rental companies only accept checks. How does an "unbanked" person pay the utility bills? Make an online purchase? I think that only by being homeless and eschewing goods and services could one live being "unbanked."
    • Agreed, in our totalitarian-financialist society it is apparently basically impossible to live indoors without a bank account.

      However, Federal Reserve statistics as cited by Wikipedia (https://en.wikipedia.org/wiki/Unbanked) claim that unbanked unpeople are 7.7 percent of Americans. The percentage of unpeople rises above 20 percent in some locales.

      I really don't know how those unpeople survive. But I must suppose they get exposed to the sharp pointy parts of the financialist machine more than we do.

    • by havana9 ( 101033 )
      One could live with cash only at the edge of society, with a lot of trips to the post office. I know some elderly people who every month go to get cash at the post office, pay the utility bills there too, and pay everything in cash. The clerks are trying to make them switch to a postal account, but these people are refusing. There is a small fee, but this is compensated by not paying the extra on bill payments, and by reducing the mugging risk. On the other hand one could live without a credit
    • If you don't have a bank, you probably go to the supermarket or other bill pay site to pay your utilities and cash your paychecks. Even if you do have a bank you may still go to the supermarket to pay utilities. I've done it.

      But given that all you need to open an account is money, it's really just a matter of getting a paycheck and taking it to a bank instead of paying extra to cash it at a check cashing place.

    • How does an "unbanked" person pay apartment rent every month? [...] pay the utility bills? Make an online purchase?

      The answer, in my experience, is that they pay more than the rest of us to do those things.

      Yes, there are prepaid debit cards and alternative banking businesses that will issue you a debit card. I live near Dallas, TX now; apartment complexes here almost always charge a percentage fee if you pay rent with a card. I've yet to see an apartment complex that accepts cash. So, yes, if you can't pay by check or bank draft, you can load the money onto a debit card (which may involve a fee) and pay your rent that w

    • by dasunt ( 249686 )

      Maybe it's because I've had at least one checking account since I was 17 (when I got my first job), but I have never been able to wrap my brain around how/why someone can function in a modern society being "unbanked" without a traditional bank or credit union account.

      Banks share a blacklist of people who have overdrawn accounts. It is quite possible to be blacklisted from getting an account at most banks.

    • Being "unbanked" certainly makes all the things you mentioned more difficult, but not necessarily impossible. I've seen long lines of people waiting at the electric company office to pay their bills in cash. I know when auto taxes are due because there is a line out the door of the county tax collector's office every month. No one HAS to go through a rental company to find a place to live. There are plenty of rooms for rent or homes owned by individuals who may prefer cash over hoping a tenant's rent check
    • by mjwx ( 966435 )

      Maybe it's because I've had at least one checking account since I was 17 (when I got my first job)

      I live in a country where no-one can be "unbanked" except by their own volition. There are zero fee bank accounts that have to be offered under government regulations. Of course these accounts are also very low feature, most are just a card but even someone who has been convicted of multiple counts of bank fraud can get a basic bank account here in the UK, albeit a very thoroughly monitored one.

      but I have never been able to wrap my brain around how/why someone can function in a modern society being "unbanked" without a traditional bank or credit union account.

      Having anyone unbanked except by their own volition (i.e. paranoid crackpots) usually indicates that a society either

    • by eth1 ( 94901 )

      It's been a while since I've had to rent, but most places will also take money orders, which you can get with cash.

      Of course, if you don't have a bank account and get the money order at your bank, getting the money order will cost you a fee, so really it's just another way people too poor to function "normally" end up having to pay more.

    • You have never been truly poor. Here is how I know:

      Rent: Money Order(s)

      Utilities: Money Order or some supermarkets accept payments at their Customer Service counter

      Online Purchase: Visa gift card or give a friend some money and have them order it for you

      There are ways to do everything with cold hard cash

      Being poor is a form of torture with how society is set up. I am sure that result is intentional to "encourage" compliance. :(

  • by cascadingstylesheet ( 140919 ) on Friday June 14, 2019 @06:39AM (#58760096) Journal

    The unsolvable "problem" is that different groups have different average characteristics and outcomes in various areas of life.

    You can argue about what the reasons for that are - I'd hate to go all "anti-science" and claim that brains are biological or anything - but regardless of why, it's a fact. A fact that isn't due to "racism" so it will not be budged by changes in "racism".

    It can only be budged by engaging in racist policies of putting your finger on the scale to change the outcomes.

    • by eth1 ( 94901 )

      The unsolvable "problem" is that different groups have different average characteristics and outcomes in various areas of life.

      This isn't actually a "problem", just a fact - so long as you only deal with the average across the whole group.

      A problem only arises if you use that average to make decisions/assumptions about a specific individual member of that group.

  • When I first started thinking about my credit, which was later in life than it should have been, I had to get a bad power bill removed from it. If charges for power at a place I had already moved out of counted on my credit, why the talk about utility payments now?
  • "computer calculations, and most recently to artificial intelligence" -- same thing, new marketing spin, why don't people understand this?
  • The whole idea of credit scores is to introduce bias into the system of giving loans. They don't want to give loans fairly or evenly - they want to weed out anyone who is not going to pay or is not going to make them money. If you want the result to be biased, you're going to have to introduce data that is also biased (though maybe in unintended ways). The thing is, it doesn't really do much good to look at biases over groups of people (however you group them), because the system is unfair on an indivi

  • Gee! Who didn't see THIS coming! This AI crap was "invented" so to speak here, but, they knew what would happen if they rolled it out here, so they gave it to the commies in China, to work out the bugs. NOW they of course want to control the behavior of Americans....You think all these cameras in every flipping place you turn was for your safety? Now banks, government will have another way to control you. Say something on "social media" against (insert name), and your credit score goes down. Become a co
  • or we forclose on your house and good luck renting after we destroy your social media score. This is Groupthink on steroids. Someone needs to blow up the establishment before it gets off the ground.
