If the words Java and Python only make you think of a hot cup of coffee and a slithery creature, it’s time to expand your vocabulary. STEM is becoming increasingly popular, and people of all ages are taking an interest in learning the basics of coding.
In honour of National Coding Week, an annual event which shines a spotlight on developing digital skills, we’ve mined the internet to compile a list of key terms that will improve your knowledge and kickstart your coding journey. Are you ready?
Perhaps one of the most important pieces of terminology for those interested in technology, an algorithm is a set of rules or processes that are followed to solve a problem or execute a function. Algorithms are the basis for most computer programs and are designed to make our everyday lives easier. You’re probably more familiar with algorithms than you think.
For example, when you go to Google and search for the daily news, how do you think Google decides which news articles should appear at the top of your results? When you open up TikTok, how does it decide which video you should see first? It’s all about algorithms, in this case, machine-learning algorithms that ‘learn’ your browsing behaviour. Algorithms are important, which is why many social networks treat them as proprietary, meaning they belong to the social network in question and their inner workings aren’t necessarily apparent to the end user – that’s me and you.
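Real search and recommendation algorithms are closely guarded and far more sophisticated, but the basic idea of "a set of rules followed to solve a problem" can be sketched in a few lines. Here's a toy ranking algorithm (with made-up article titles) that scores each article by how many search terms appear in its title and sorts the best matches first:

```python
# A toy ranking algorithm: score each article by how many of the
# search terms appear as words in its title, then sort so the
# highest-scoring articles come first.
# (Illustrative only -- real search engines use far more signals.)

def rank_articles(articles, search_terms):
    def score(title):
        words = title.lower().split()
        return sum(term.lower() in words for term in search_terms)
    return sorted(articles, key=score, reverse=True)

articles = [
    "Local weather forecast",
    "Daily news roundup",
    "News: daily markets update",
]

print(rank_articles(articles, ["daily", "news"]))
# "Daily news roundup" matches both terms, so it comes top.
```

The rules here are simple and fixed; the machine-learning algorithms mentioned above go a step further by adjusting their rules based on your behaviour.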
Many years ago, before the dawn of Facebook and YouTube, there was Babbage. Charles Babbage was an English mathematician, philosopher, inventor and mechanical engineer who lived in the 19th century and is often referred to as ‘the father of the computer’. He designed the Difference Engine in the 1820s, and his later, more ambitious design for the Analytical Engine is credited as originating the concept of the digital programmable computer.
The Difference Engine, parts of which you can see at the Science Museum in London, is an automatic mechanical calculator capable of tabulating polynomial functions. Over 100 years after Babbage’s death, computer scientists built the first fully functioning difference engine from his original plans.
Both a skittery little creature that appears on your wall acting like it owns the place and an important computer-related term, a bug is a flaw in computer software or hardware that can cause an error or make it behave in a different way than intended. Bugs can crop up anywhere, from when Excel crashes unexpectedly to when you can’t get your camera to work properly on Zoom.
Related to bugs is the process of debugging: the act of finding and correcting such mistakes. When you update an app on your smartphone, you might notice that the developers list the bugs they’ve fixed in the update notes. Fun fact: the first computer bug to be identified was an actual bug! In September 1947, Dr Grace Hopper, a computer scientist working at Harvard University, and her colleagues noticed one of their computers was constantly making mistakes. When they opened it up, they found a dead moth inside.
Now we’ve acquainted ourselves with some basic ideas, it’s time to look at some technical vocabulary related to the languages of computing. CSS is an acronym for Cascading Style Sheets, a language that describes how a document written in a markup language, such as HTML, should be presented. Most commonly associated with building websites, CSS helps web developers tell the browser how a webpage should look, from colours and background images to the fonts that should appear.
Think of it this way, if HTML (which we’ll talk about in a moment) represents the structure of a building, CSS could be likened to the paint, flooring and wallpaper. CSS is considered one of the cornerstone technologies of the internet and allows developers to separate the content and visuals of a webpage.
Java is a powerful multi-platform programming language, which sits alongside HTML and CSS as one of the most widely used languages on the internet. Java has been used as the main language for many popular professional and commercial pieces of software, and for years it was the primary language for developing Android apps.
First released in 1995, Java’s flexibility and its ability to create high-performance applications have made it extremely popular with developers. In fact, the sandbox videogame Minecraft was developed in Java and released back in 2011, and has gone on to sell over 238 million copies worldwide, making it the best-selling video game of all time.
One of the most widely used languages on the internet, HTML is an abbreviation for HyperText Markup Language. It is the standard markup language used to create webpages, and if you’re just starting out with coding, it’s usually the first language you’ll be taught.
That’s because HTML builds the structure of the webpage or program you’re trying to create, allowing you to describe the layout of the page – for example where headings, text, tables and images might go – before you apply any style or design using CSS. First released almost 30 years ago, HTML is a standardised markup language that has seen many changes over the decades as the World Wide Web has developed and grown.
If you’ve ever watched The Terminator, you’d be forgiven for thinking that machine learning and artificial intelligence are interchangeable, and maybe just a little bit frightening. In fact, machine learning is a branch of artificial intelligence in which a computer learns rules and patterns from raw data that has been fed into it by a person, rather than being explicitly programmed with them. While artificial intelligence is the broader concept of machines carrying out tasks that would normally require human intelligence, machine learning depends on the data it is given.
Early uses of machine learning included a scanner for the US Postal Service, which could process mail faster by learning to read handwritten addresses on letters. Other applications of machine learning in everyday life include your navigation app estimating how long it will take you to get to work today by train, or Spotify suggesting a playlist based on music you’ve listened to recently.
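To make the idea of "learning rules from data" concrete, here is a deliberately tiny sketch with hypothetical journey times: the computer isn't told how long a train trip takes, it works out a rule (the average duration for each departure hour) from past examples, a little like a navigation app would.

```python
# A minimal 'learning from data' sketch: estimate how long a train
# journey will take by averaging past journeys made at the same hour.
# (Hypothetical data -- real navigation apps use far richer models.)

from collections import defaultdict

def train_model(past_trips):
    """past_trips: list of (departure_hour, minutes_taken) pairs."""
    by_hour = defaultdict(list)
    for hour, minutes in past_trips:
        by_hour[hour].append(minutes)
    # The 'rule' the computer learns: the mean duration for each hour.
    return {hour: sum(mins) / len(mins) for hour, mins in by_hour.items()}

def predict(model, hour):
    return model[hour]

trips = [(8, 42), (8, 48), (9, 35), (9, 37), (8, 45)]
model = train_model(trips)
print(predict(model, 8))  # average of 42, 48 and 45 -> 45.0
```

Feed the model more trips and its predictions change accordingly: the rules come from the data, not from a programmer writing them by hand.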
Much like human languages, programming languages have their own set of rules on how statements can be expressed. These rules are known as syntax. While many of the programming languages available to developers share features, functions and capabilities, they often differ in syntax. Just as English, French, Spanish and Japanese each have a word for egg, each language expresses it differently and follows its own grammatical rules for using it properly in context.
Without the proper syntax, it’s almost impossible for developers to write an executable program. If you get the syntax wrong, your program or webpage is much more likely to crash or throw an error. Examples of syntax errors include misspelled statements, misspelled variable names, or missing punctuation that prevents your code from running properly.
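Here's what that looks like in practice, using Python as an example. The snippet below contains a classic syntax error, a missing closing parenthesis, and Python refuses to run it at all, reporting a SyntaxError instead:

```python
# A classic syntax error: the closing parenthesis is missing, so
# Python cannot parse this code, let alone run it.
bad_code = "print('Hello, world'"

try:
    compile(bad_code, "<example>", "exec")
except SyntaxError as err:
    print("SyntaxError:", err.msg)
```

The exact wording of the error message varies between Python versions, but the point stands: the code is rejected before a single statement runs.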
A variable in a programming language is a container that can hold a number, a word, or any other piece of information you want to use in your program. Think of variables as a set of treasure chests that you fill with different valuables, and name in a way that makes them easy to find when you need them. All variables have three key parts: a type, a name, and a value.
One of the most important things to remember about variables is that you want to give them clear, coherent names. Their sole purpose is to label and store data in your code that can be used throughout your program. A variable can be modified as many times as you need, depending on the data that is being fed into your program, be it an app, game, software or website.
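In Python, for instance, the three parts of a variable look like this (the names `score` and `player_name` are just examples chosen for illustration):

```python
# Each variable has a name, a value, and a type (which Python
# infers from the value it is given).
score = 0            # name: score, value: 0, type: int
player_name = "Ada"  # name: player_name, value: "Ada", type: str

# A variable can be modified as often as the program needs.
score = score + 10

print(player_name, score)    # prints: Ada 10
print(type(score).__name__)  # prints: int
```

Notice how the clear names make the code read almost like a sentence, which is exactly why naming your variables well matters.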
By Rachel Quin
Rachel Quin is a freelance marketer and copywriter with a love of language, books and cats.
All opinions expressed on this blog are those of the individual writers, and do not necessarily reflect the opinions or policies of Collins, or its parent company, HarperCollins.