Algorithm ~

What is an algorithm? ~ computer code and complex structures underpinning: systems; apps; computer processes; technology.

 
Intricate and complex IT Technology


Understanding IT from the ground up - a holistic approach.

“The Algorithm” is impenetrable.  It is mysterious, it is all-knowing, it is omnipotent.  Except that it is not. 

As an IT Expert Witness, one area which forms part of many Assignments is to understand the underlying computer code and complex structures underpinning many: systems; apps; computer processes; technology; and websites.

An algorithm is a simple concept that, today, has many complex manifestations.  The central and opaque position algorithms occupy at the heart of social networks like Facebook causes some to view algorithms in general with a sort of mystical reverence.  Algorithms have become synonymous with something highly technical and difficult to understand, that is either an arbiter of objective truth, or, at the other end of the spectrum, something wholly untrustworthy.

Nevertheless, when people refer to “the algorithm” - whether Facebook's or another tech company’s recommendation algorithm, or just “algorithms” in general - do they really know what it means?  Judging by how widely the term is used and misused, most likely not.  Embarking on an exploration of algorithms, we wanted to get something straight right off the bat: What is an algorithm, anyway?

Pedro Domingos, a computer science professor at the University of Washington who has also written a book about the ever-growing role algorithms play in our lives, believes that, before being alternately impressed by or distrustful of the next computer algorithm you encounter, you should get back to basics on the concept that is powering our world. 

1. An algorithm is a set of very specific instructions

How to bake a cake, find the sum of two plus two, or even run a country according to the U.S. Constitution, are all examples of algorithms.  Why?  Because, according to Domingos, the definition of an algorithm is “a sequence of instructions.”  That is it!

Today, an algorithm usually refers to “a sequence of instructions that tells a computer what to do.”  A computer program is an algorithm, written in a computer programming language, which a computer can understand and execute.  Computerised algorithms have been around since the IBM mainframes of the 1960s, and they continue in the same basic format: executable code that produces “results”.
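The point can be made concrete with a deliberately tiny sketch of my own (not from the article): even the addition example can be written out as an explicit sequence of steps a computer follows.

```python
def add(a, b):
    """A tiny algorithm: a precise sequence of steps that produces a sum."""
    total = a          # step 1: start from the first number
    total = total + b  # step 2: add the second number
    return total       # step 3: report the result

print(add(2, 2))  # 4
```

As Domingos notes, addition takes a few lines to define; real-world algorithms can take millions.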

Algorithms written for computers also have to be extremely precise, often using instructions such as “if,” “then,” and “else.”  For example, a self-driving car might run on an algorithm for navigating that says, “IF the algorithmic directions say turn left, THEN turn left.”  See how specific you have to be to make a computer follow a seemingly simple set of instructions?
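A minimal, hypothetical sketch of that kind of branching instruction (the function name and direction values are invented for illustration):

```python
def next_move(direction):
    """Illustrative IF / THEN / ELSE logic for a navigation step."""
    if direction == "left":
        return "turn left"      # IF the directions say left, THEN turn left
    elif direction == "right":
        return "turn right"
    else:
        return "continue straight"

print(next_move("left"))  # turn left
```

Every possible input has to be handled explicitly; the computer does nothing that is not spelled out.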

In the popular imagination, recommendation algorithms have come to dominate our idea of what an algorithm is.  That is, when many people think about or refer to algorithms, they are referencing something like a TV show Netflix thinks you might like, or which international travellers belong on the no-fly list.  While these are extremely complicated algorithms, at their hearts, they are still just a set of instructions a computer follows to complete a specified task. 

“With computers, the algorithm can get vastly more complex,” Domingos said.  “Addition is an algorithm that's defined in a few lines of text.  Computers can have algorithms that take millions of lines to define.”

2. People wrote and used algorithms long before computers even existed

As early as the Babylonian era, humans were writing algorithms to help them do the mathematical equations that allowed them to manage their agricultural society. 

“There were algorithms before computers, because you don't need a computer to execute an algorithm, the algorithm can be executed by a person,” Domingos said.

Algorithms using computers first rose to prominence in the mid-20th century, when the military began writing formulas for, say, determining where to aim a missile at a moving object.  The concept then moved into business administration, with computers running formulas for administering payroll and financials, and into science, for tracking the movement of objects in the sky. 

A turning point for modern algorithms came when Larry Page and Sergey Brin wrote the Google PageRank algorithm.  Instead of just relying on information within a page to determine how relevant it was to a search term, the search engine algorithm incorporated a host of other ‘signals’ that would help it surface the best results.  Most notably, how many other pages linked to a page, and how reputable those linking pages were, based on how many links pointed to them, and so on.  The two believed that this indicated a powerful sign of relevance.  And the rest is history. 
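The link-counting idea can be sketched in a few lines.  This is a textbook-style simplification, with an invented three-page web and a standard damping factor, not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively score pages: a page's rank depends on the ranks of the
    pages that link to it, so a link from a reputable page counts for more."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start every page with equal rank
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the shares of rank flowing in from pages that link to p.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# Invented example: pages A and B both link to C, so C ranks highest.
web = {"A": ["C"], "B": ["C"], "C": ["A"]}
scores = pagerank(web)
print(max(scores, key=scores.get))  # C
```

Even this toy version shows the key insight: relevance is inferred from the structure of links between pages, not just the content of the page itself.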

3. Today, you can find algorithms everywhere

We might think of algorithms as mathematical equations, but algorithms, according to Domingos, “can compute anything from anything, there might be no numbers involved at all.”  One prominent and extremely complex algorithm is the algorithm that governs the Facebook News Feed.  It is an equation that Facebook uses to determine what pieces of content to show its users as they scroll; in other words, a set of instructions to decide what goes on the “News Feed”.

“There's no end of things that Facebook could put on your News Feed but it has to choose,” Domingos said.  “And it's usually a combination of things like: how much do you care about the people that, directly or indirectly, produced that post?  How close are they to you in your social network, how relevant is it in its own terms because of the subject, and also how recent?”
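As a purely illustrative sketch of that signal-combining idea (the field names and weights are my assumptions, not Facebook's), ranking might reduce to scoring each post and sorting:

```python
def feed_score(post, weights=(0.5, 0.3, 0.2)):
    """Combine several signals about a post into one ranking number.
    All signals here are assumed to be pre-computed values in [0, 1]."""
    w_affinity, w_relevance, w_recency = weights
    return (w_affinity * post["affinity"]      # how much you care about the poster
            + w_relevance * post["relevance"]  # how relevant the subject is to you
            + w_recency * post["recency"])     # how recent the post is

# Invented posts: a close friend's older post vs. a stranger's fresh one.
posts = [
    {"id": 1, "affinity": 0.9, "relevance": 0.4, "recency": 0.2},
    {"id": 2, "affinity": 0.3, "relevance": 0.9, "recency": 0.9},
]
ranked = sorted(posts, key=feed_score, reverse=True)
```

The real system is vastly more complicated, but at heart it is still this: a set of instructions that turns signals into an ordering.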

Facebook, Google, Apple, Amazon, and other big tech companies all rely on algorithms to serve content and products to their customers.  However, there are also algorithms throughout your life that you might not be aware of. 

For example, Domingos explained that an algorithm governs how your dishwasher knows when it’s time to transition from washing to drying, or how your car regulates fuel intake and knows when its tank is full while at the gas or fuel station, or how shadows appear in a digitally animated movie to perfectly replicate the sun in the real world.

“Clearly, every time you interact with the computer, or you're on the internet, there are algorithms involved,” Domingos said.  “But these days algorithms are also involved in just about everything.”

4. The most complex algorithms use Machine Learning

As we have noted, an algorithm typically has to be written in “excruciating detail” for a computer to understand what to do.  However, that is not the case when the people who write algorithms incorporate machine learning, a type of artificial intelligence, which leads to the most sophisticated algorithms.

“In traditional programming, a human being has to write down every little detail of what the computer has to do, and that is very time consuming, very costly,” Domingos said.  “Machine Learning is the computer discovering its own algorithms instead of being told what to do.”

Put another way, Machine Learning is when a computer programmer or software engineer feeds a program some raw data as a starting point, then supplies the end point, an organized, classified version of what that data should look like, and leaves it up to the program to figure out how to get from point A to point Z.  Consider an onion: a human who knows how to cook can turn that onion from a pungent raw sphere into strips of caramelized goodness.  In a traditional algorithm, a programmer would write every single step of the ‘cooking instructions’.  However, in an algorithm developed by Artificial Intelligence, given the end point as a goal, the program would figure out how to get from raw to caramelized by itself.  Hence, the “machine learned”. 

These types of algorithms become even more powerful when a human being would not know how to get from point A to point Z.  For example, a human process like being able to recognize that “a cat is a cat” takes so much complicated brainpower that it would be impossible to write out step by step.  However, by giving a computer program a bunch of images of a cat, and images that are not a cat, and showing the desired endpoint as categorizing a cat image as a cat, the computer can learn to execute that process itself.  There is a phase, part-way through the process, when this computerized “trial-and-error” needs some human interaction or tempering to gain the desired result.
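A toy sketch of this learn-from-examples idea, using invented numeric “features” and a simple nearest-neighbour rule rather than any real image model:

```python
def nearest_label(examples, features):
    """Classify a new input by finding the most similar labelled example,
    instead of following hand-written rules for what makes a cat a cat."""
    def distance(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    closest = min(examples, key=lambda ex: distance(ex[0], features))
    return closest[1]

# Invented two-number "features" per example (e.g. scaled ear pointiness
# and whisker density), each paired with the desired end-point label.
training = [((0.9, 0.8), "cat"), ((0.8, 0.9), "cat"),
            ((0.1, 0.2), "not cat"), ((0.2, 0.1), "not cat")]
print(nearest_label(training, (0.85, 0.75)))  # cat
```

No one wrote a rule saying what a cat looks like; the “rule” is implicit in the labelled examples, which is the essence of the machine learning Domingos describes.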

“It is the computer learning to program itself instead of having to be programmed by people,” Domingos said.  “This, of course, is extraordinarily powerful when it works, because now you can, you know, create very powerful, very complex algorithms with very little human intervention.”  It is also very funny when it does not work.

5. Despite the term’s recent cachet, algorithms are not magic

Thanks to the sheer amount of data algorithms process, it might seem like they are all-knowing mystery boxes built to reveal secrets.  However, remember that an algorithm just means a set of computer or software instructions.  What’s more, humans create algorithms, which means they can be flawed.  Over time, other factors also modify the algorithm and its processes, for example system patches, updates, and fixes on the “host” computer configuration.  Some of these may produce unforeseen consequences, or inadvertently change the results structure.  Experts can unpick these at the systemic level, and oftentimes have to, as part of their Expert Witness tasks.

“There's also a lot of misconceptions about algorithms, partly because people don't really see what's going on inside the computer,” Domingos said.  “A very common one is that people think that algorithms are somehow perfect.”

Computer Programmers spend enormous amounts of time fixing mistakes in algorithms so that the lines of code produce the appropriate results.  However, humans do not always catch those mistakes.  What is more, an algorithm is based around the output a human wants to see, or what that human is optimizing for.  Take a hiring algorithm, which ostensibly should find the best candidate for a job.  If a human sets the instructions to look at qualifications that are not necessarily relevant to a job (say, university pedigree, or sporting interests), just because the algorithm then says “candidate A is the best person,” does not make it the truth.

Often, that is because of bias.  Moreover, problems with bias can get even worse with algorithms that utilize Artificial Intelligence.

“In traditional programming you have to worry about the biases of the programmer,” Domingos said.  “In machine learning, mainly, you have to worry about the biases that come from the data.”

For example, a hiring algorithm powered by machine learning might use as its starting point a bunch of resumes of candidates, and as its output the resumes of people who were hired in the past.  However, most tech companies are not racially or culturally diverse.  Therefore, an automated algorithm that makes hiring recommendations could potentially mirror that real world inequality.

Studies have shown that artificial intelligence can mirror the gender and race stereotypes of the humans that train them, although this appears to be improving year-by-year.  In one study, an algorithm that produced word associations used the entirety of the English language on the web as its training data to learn associations between words.  Thanks to the biases that for centuries have existed in our world, the algorithm determined that female names were more associated with the arts, while male names were more associated with math and science.  Studies like these show that algorithms are not inherently neutral, perfect, or malevolent: they simply do what the humans and data that train them say to do.  In short, they are just as flawed as humans are.

6. Algorithms are ushering in a technological revolution

Algorithms may be imperfect, but they are nonetheless transforming our 21st century world. 

All those things we take for granted, like the web and social media, and on and on, would not exist without algorithms.

As these automated sets of instructions become more and more widespread - from your dishwasher to government supercomputers - we have the ability to exercise our knowledge more quickly and efficiently than ever before. 

Algorithms are doing for mental work what the Industrial Revolution did for manual work.  Algorithms are the automation of intelligence.  And if you think about it, that is a very powerful thing: something that used to take human thinking and labour can now be done by an algorithm.

Algorithms are here to stay.  However, how we design them - biased or equitable, helpful or harmful - and how much we unquestioningly accept their presence, is up to us as we navigate the complex maze of IT technology and cyber-space.