Credit scores are a fact of life for Americans in the 21st century, but they haven't always existed.
Without credit scores, lenders might still be relying on handshakes and other subjective measures to determine creditworthiness. Doing so would leave millions of Americans without access to the capital necessary to buy a home or run a business. The invention and evolution of credit reporting helped shape the financial lives of generations of Americans.
So how did we get credit reports and credit scores as we know them today?
19th Century: The Beginning of Credit Reporting in America
For most of America's history, decisions about who should be trusted to borrow money rested largely on the judgment of individual creditors and merchants, who sized up borrowers by their reputations in their communities.
But as cities grew and agricultural activities gave way to more sophisticated industrial business enterprises, lenders and banks needed new ways to evaluate the worthiness of potential borrowers. So in the mid-1800s, merchants started compiling financial information about the reputations and habits of their customers. Then, they began trading that information with other merchants and lenders. Merchant associations compared notes on good customers and bad ones. Eventually, these merchant associations morphed into the first credit reporting agencies.
Early credit reports weren't especially sophisticated. Credit reports in the 19th century included subjective statements of opinion about the character or trustworthiness of potential commercial borrowers. No surprise, the opinions in those early reports reflected the class, race, and gender biases of the established merchants and lenders of the day.
Credit reports helped merchants and lenders evaluate businessmen they might not know personally. But final decisions about credit and lending were still based on the subjective judgment of individual lenders.
Early 20th Century: Credit Reports for Individuals, Not Just Businesses
At first, credit reporting in America was just for businesses and potential business deals. Credit reporting and credit ratings for individual consumers didn't really take off until the beginning of the 20th century, when department stores and other retailers began extending credit to individuals to encourage spending by America's burgeoning middle class.
Retail credit managers in the early 20th century began evaluating individuals using techniques established by commercial lenders. That meant that for the first half of the 20th century, credit reports for individuals included not just information about a consumer's financial behavior, but also his or her social and political behavior. Credit reports of that era recorded a person's race, sex, employment history, marital status, and sometimes even medical history.
Mid-20th Century: The First Credit Scores, and the Fair Credit Reporting Act
Early credit reports included all sorts of information about individual borrowers. But they were missing one thing essential to modern credit reporting: an official credit score.
Different credit reporting agencies developed their own internal credit ratings and rankings over the decades. Then in 1956, engineer Bill Fair and mathematician Earl Isaac founded Fair, Isaac, and Company and began pitching the first standardized credit scoring system to American lenders. (Fair, Isaac, and Company is better known today as FICO, which still provides the credit scores most lenders use to evaluate potential borrowers.)
Credit scores removed much of the subjectivity from credit-granting decisions. Scores gave lenders an objective measure of the potential creditworthiness of individual borrowers. A single standard for judging potential borrowers helped create access to credit for people who had previously been shut out of traditional lending.
Still, credit reporting agencies remained controversial well into the 1960s. They focused largely on reporting negative information, scraping newspapers for juicy stories and routinely adding personal details about consumers' lives to their files. For most of the 20th century, individuals were not allowed access to their own credit reports. Secret files full of personal details shaped the financial well-being of Americans for decades.
As credit reporting agencies began computerizing their files and systems, public outcry led to a Congressional inquiry, and eventually to the Fair Credit Reporting Act of 1970. (Fun fact: the FCRA was essentially the first federal digital privacy law.) The FCRA tightened protections for consumers, required credit reporting agencies to give individuals access to their own credit history files, and gave consumers avenues to dispute and correct inaccuracies on their reports.
21st Century: Free Credit Reports
Consumers have been able to access their own credit reports for decades now. But guaranteed access to your own official credit scores is relatively recent. Until 2003, credit reporting agencies and creditors were only required to share the contents of your credit history. In 2003, the Fair and Accurate Credit Transactions Act amended the 1970 Fair Credit Reporting Act to require each of the three major credit reporting bureaus to provide a free copy of your credit report each year and to grant individuals access to the credit scores used by lenders and creditors.
Credit reporting may be a fact of life now. But it’s important to remember that the way things are isn’t the way they’ve always been. Or for that matter, the way they’ll always be.
Credit scores and credit reporting continue to change and evolve, even today. Understanding the history of credit reporting helps us see the ways that institutions and individuals shape the financial lives of everyday Americans. Better access to fairer credit allows businesses to thrive and homeowners to realize the American dream.