People born after 1980 may find it hard to believe that credit cards were not always ubiquitous.

Credit has existed for many decades, but not the buy-anything-anywhere-anytime variety familiar to us today.

The history of credit cards begins in the farming communities of the early 20th century. Bankrate takes you on a journey from that era to the present day.

Credit cards revolutionized payment processes. Here’s how.
The life story of credit cards
  1. The old general store.
  2. Diners Club.
  3. Emergence of Visa, Mastercard.
  4. Banking deregulation begins.
  5. Profitability in the 1980s.
  6. Rewards a big hit.
  7. Credit card reforms.
  8. The future of credit cards.

The old general store

You’ve seen this in literature or movies depicting the agrarian society of about a hundred years ago. In rural areas, the proprietor of the local general store would extend credit to regular customers, and department stores in the cities would do the same.

“When we were more of a farming nation, we were driven by credit. Every sort of general store in the more rural regions extended credit to farmers and others as a cost of doing business,” says Lewis Mandell, Kermit O. Hanson visiting professor of finance and business economics at the Foster School of Business, University of Washington. “In those days, credit was sort of a generalized, open-book credit.”

The merchant would just record in his ledger the amount the customer owed, he explains. “The more farming-oriented a society was, the more dependent they were on consumer credit.”

As urbanization grew, department stores made credit available to more middle-income customers, and it became necessary to implement a different method to track customer accounts. The first credit cards were simply made of cardboard or paper. Sometime in the 1920s, embossed metal plates, similar to Army dog tags, were introduced.

“They could actually run these through a little roller and get a copy of the customer’s card,” says Robert Manning, Ph.D., author of “Credit Card Nation.”

The cards were generally associated with only one vendor, for instance, Macy’s or Bloomingdale’s. The world had to wait until 1950 for the first universal card to be introduced: Diners Club.

Diners Club

The Diners Club card evolved from a simple idea: that it would be nice to have a substitute for cash or a checkbook that could be used at more than one place.

“At the time, individual stores issued charge cards — something like credit cards that would allow you to pay on installments. But if you were shopping, let’s say, in New York, you would have to carry around many different cards. So this created a card that people could use at many different merchants,” says David S. Evans, economist and author of “Paying with Plastic: The Digital Revolution in Buying and Borrowing.”

The general-purpose Diners Club card was a big innovation, says Evans. The original card was a charge card, meaning the balance had to be paid at the end of the month.

Competitors entered the market shortly thereafter, such as the American Express card in 1958 and Carte Blanche, issued by Hilton Hotels Corp.

Emergence of Visa, Mastercard

Banks also issued their own credit cards throughout the 1950s. Visa got its start as a bank-issued card in 1958, although it was called BankAmericard, named after Bank of America, then a California bank.

“Back when there were no national banks, only state or local banks, Bank of America introduced a credit card — a true credit card where you could pay things off over time,” says Evans.

“Then, in 1966, they set up a national franchise program. Banks around the country were licensed to issue the BankAmericard credit card. That was a pure franchise organization where Bank of America was the franchiser, like McDonald’s, and the other banks were franchisees issuing the cards.”

For various reasons, franchisee banks chafed under the agreement. Evans says the franchisees rebelled, leading to the elimination of the franchise organization and the institution of a “true cooperative where Bank of America was just one of many banks that had a share in the cooperative. And that was the birth of Visa,” he says.

BankAmericard officially became known as Visa in 1976.

Mastercard also came about as the result of a cooperative effort between various banks.

“Mastercharge, which was known as Interbank, really began as a group of banks largely in the Northeast that wanted to get together and honor each other’s cards so that people could use the card outside of the immediate bank area,” says Mandell.

“Don’t forget, back in those days, there were laws against interstate banking, so banks were really confined only to states in most cases,” he says.

Banks issuing credit cards needed to make sure the cards were widely accepted enough to make them useful. This meant they had to sign up merchants in addition to getting other banks to honor them.

For their part, merchants welcomed the cards, even though accepting them cost money.

“Merchants had a choice, basically,” says Mandell. “They could sign up or not sign up. And if they did sign up, they would have to pay a small part of their proceeds. Called the merchant discount, it started at 7 percent and started coming down fairly quickly thereafter.”

“Merchants were confronted with the notion of either pay this and have an opportunity to expand your business, or don’t — pretty much the same as today when merchants are confronted with American Express,” he says, referring to the higher merchant discount American Express charges compared with its competitors. “They can say, sure, they charge a high merchant discount, but we’ll pay it because our margins are high enough and it encourages the kind of business we want. Or we won’t pay it.”
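To put numbers on that trade-off, here is a minimal sketch in Python of how a merchant discount works. It assumes a hypothetical $100 sale and uses the 7 percent starting rate from Mandell’s quote; the lower comparison rate and the function name are illustrative assumptions, not figures from the article.

    def merchant_proceeds(sale_amount, discount_rate):
        """Amount the merchant keeps, in dollars, after the merchant discount on a card sale."""
        fee = sale_amount * discount_rate
        return round(sale_amount - fee, 2)  # round to whole cents

    # A hypothetical $100 card sale at the 7 percent merchant discount Mandell cites:
    # the merchant keeps $93 and gives up $7.
    print(merchant_proceeds(100.00, 0.07))  # 93.0

    # The same sale at a hypothetical lower discount rate, for comparison only:
    print(merchant_proceeds(100.00, 0.03))  # 97.0

In either case, the merchant is weighing that fee against the extra sales the card is expected to bring in, which is exactly the choice Mandell describes.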

Banking deregulation begins

Before credit cards could really catch on, banks had to figure out a way to get around interest rate constraints imposed by interstate banking laws.

These laws prohibited banks in one state from lending money to people in another. States set their own interest rate ceilings, so an out-of-state bank couldn’t lend at its own higher rates to another state’s citizens.

In 1978, a Supreme Court ruling in the case of Marquette National Bank of Minneapolis v. First of Omaha Service Corp. changed that. The result: Nationally chartered banks could charge people in other states the interest rate set in the bank’s home state. Like miners to California during the gold rush, big banks flocked to states that had no cap on interest rates.

“It (the Marquette ruling) allowed banks to become national issuers of cards with more or less uniform interest rates, and allowed banks to set up shop in Delaware or South Dakota, where there are no limitations on interest rates,” says Evans.

This step toward deregulation, combined with liberalized interest rates, resulted in the expansion of credit, he adds.

Profitability in the 1980s

Despite the growth and development of the industry through the 1970s, the profitability of credit cards lagged. That would change in the 1980s.

“That is the modern phase of the distinction between the charge and the credit card. The timeline of that is really beginning at the ’82-’83 recession, because that was pretty much the end of the usury laws and with the dramatic reduction in inflation, credit cards became profitable again,” says Manning.

Interest rates had declined — though lending rates did not — and the period of stagflation came to an end, leaving consumers more inclined to spend.

“There was a big period of great profitability from 1983 to 1990,” says Evans.

“Up until 1983, the banks were charging interest rates of 18 percent or so, but on the other hand their cost of capital was really big. So the spread they had between the money they lent and their own cost of capital was pretty small,” he says.
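To illustrate the spread Evans describes, here is a minimal sketch in Python. The 18 percent card rate comes from his quote; the cost-of-capital figures are hypothetical, chosen only to show how a narrow spread can become a wide one when funding costs fall.

    def gross_spread(card_rate, cost_of_capital):
        """Gross spread an issuer earns per dollar lent: the rate it charges minus what its funds cost."""
        return card_rate - cost_of_capital

    # Before rates fell: an 18 percent card rate against a hypothetical 15 percent
    # cost of capital leaves only a 3-point spread.
    print(round(gross_spread(0.18, 0.15), 2))  # 0.03

    # After rates fell: the same 18 percent card rate against a hypothetical 8 percent
    # cost of capital leaves a 10-point spread, the shift behind the profitable 1983-1990 run.
    print(round(gross_spread(0.18, 0.08), 2))  # 0.1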

In some ways the early 2000s could be compared to the 1980s, with lots of borrowing and plenty of freewheeling spending.

“Everyone was feeling optimistic and engaging in a lot of borrowing. But over time these guys have good years and they have bad years — like now,” says Evans.

Rewards a big hit

In 1989, Citibank struck a deal with American Airlines to give consumers reward points, ushering in a new era for the industry.

“That was a big deal,” says Evans. The concept snowballed from there.

“It led to more people getting cards because they got rewards, but it also led to people wanting to use the cards more, because in addition to getting financing, they would get rewards,” Evans says.

It was also an era for fees.

“At first, fees were not considered to be a mainstay of the business. They were virtually an afterthought, I think,” says Mandell. “As the credit card companies got more and more competitive, they really started to target users who were most lucrative to them: people who carry a balance from month to month,” he says.

In their quest for new customers, card issuers offered teaser interest rates of zero percent for balance transfers, and savvy credit card users took them up on their offers to save money.

“So the card companies, through competition, ended up reducing the usual sources of income,” says Mandell. “Many — if not most — gave up annual fees in the ’80s. So now, without the annual fees, they were dependent on merchant discounts, which were very low because of competition. Then they began squeezing those and began looking around for other sources of income,” he says.

Fees turned out to be pretty lucrative for credit card companies, and they never looked back.

Credit card reforms

Credit cards have since developed a reputation for business practices that are ethically questionable from a consumer standpoint, so much so that regulators recently moved to rein in credit card companies.

On Dec. 18, 2008, the Office of Thrift Supervision, the Federal Reserve Board and the National Credit Union Administration adopted a set of regulations to protect consumers from the most egregious lending practices, such as double-cycle billing and universal default. The regulations will go into effect July 2010.

Included in the reform package are rules that prohibit raising interest rates on current balances — except index-related movement on variable-rate cards — and disallow rate changes in the first year a card is open unless the rate change was disclosed at the beginning of the business relationship.

In the midst of a credit recession, card issuers say the new rules will hurt their bottom lines and affect the interest rates and credit limits that customers receive.

The future of credit cards

Despite the belt tightening currently going on in the industry and the economy at large, credit cards won’t be disappearing anytime soon.

“Credit cards won’t ever go away. Those of us who travel for a living realize how dependent we are on them. But the nation has to learn how to save again,” says Mandell.

Evans predicts that the next few years will be rocky for credit card companies as the country re-evaluates its saving and spending habits.

“Most people are skittish about adding more debt now, and banks are skittish about lending money to people who may not be able to pay it back. And the people who are most interested in using credit cards at the moment are precisely the people that the banks don’t want on their books,” says Evans.

“I would think that consumers would probably switch to using credit cards, but paying their bill off at the end of the month. Or not using credit cards and using debit cards or using some other payment option that doesn’t involve financing,” he says.

Bankrate’s recent credit card poll backs up that hypothesis. Though half of Americans with credit cards said they don’t plan to change the way they use them in 2009, 32 percent plan to charge less and 15 percent say they won’t use credit cards at all.

Even as credit cards take a less prominent place in our collective wallets, new technologies could be on their way that will make credit cards easier and more convenient to use than ever.

“I think magnetic stripes will be replaced with other digital mediums that are much more secure,” says Mandell. “In places like Japan, people can already buy stuff utilizing their phones.”

“I think actually the widespread use of debit cards at point of sale has eroded some of the business that might have gone to credit cards, and I think that the notion of a debit card with a line of credit in back of the account is a very good substitute for credit cards and something that will be used more in the future,” he says.

“Banks might even be able to relate it to some collateral, like a home equity line. They are doing this already, but it tends not to be widely used at this point,” says Mandell.

However, as the use of credit evolves, the cards themselves will likely become more useful.

“Credit cards are going to become increasingly a mechanism for information transfer. As American credit cards catch up with their use in other parts of the world, the smart card technology will be used in many different ways. So I think the credit card will become far more multifaceted and multifunctional,” says Manning.

Smart card technology has been around since the 1970s, but the current form has been used in Europe and Asia for about a decade. Smart cards are just beginning to gain acceptance in the United States.

Instead of using a magnetic stripe to store information, smart cards contain a microprocessor that stores loads of information. They can be used for security, as bank or credit cards or ID cards, and to store health insurance information. They don’t even have to be cards. As mentioned above, the Japanese are finding that phones make an ideal medium.

“It will have all your different financial service accounts that you can go back and forth with, it could keep track of transactions and it will interface with other store databases. There are so many possibilities in the future,” says Manning.

The near-term outlook for credit cards may be somewhat grim, but the long term holds promise for the industry, and for consumers who are savvy about how they use them.
