What is IT?
Information Technology, or IT, has transformed our lives in ways we may not even realize. Thanks to IT, we can communicate massive amounts of data to people and organizations across the world in the blink of an eye. Computers power everything from calculators to medical equipment to complex satellite systems to the trading desks of Wall Street. They are powerful and invaluable tools that enable people to get their work done and connect with one another.
IT is the use of digital technology, like computers and the internet, to store and process data into useful information.
The IT industry refers to the entire scope of jobs and resources related to computing technologies. IT is about solving problems in the digital world, across fields such as education, medicine, journalism, construction, transportation, entertainment, and virtually any other industry.
There is a growing skills gap between people who are accustomed to IT and those who have had little or no exposure to it. This gap is called the digital divide, and IT personnel help manage and bridge it. By getting into IT, one can help, and even inspire, others to bridge the digital divide.
Role of an IT Support Specialist
In general, IT support personnel make sure that an organization's technological infrastructure is running smoothly. This includes managing, installing, maintaining, troubleshooting, and configuring office and computing equipment.
Course Introduction
Computers are everywhere. By taking this course, the learner will come to understand how a computer works and get a grasp of the building blocks of IT: the basics of how computer hardware performs calculations, how operating systems control and interact with hardware, how the internet lets computers talk to each other, and how applications and programs tie all of this together so humans can interact with them. Finally, the course covers problem-solving with computers and essential communication skills in IT.
History of Computing - From Abacus to Artificial Intelligence
A computer is a device that stores and processes data by performing calculations. Historically, the term computer used to refer to someone who actually did the calculations. This course deals with how these calculations are used and baked into applications such as social media or games.
The following section discusses the history of computers and the journey they took to become what they are today. It is always a good idea to learn about the past (this applies to science and philosophy alike) in order to appreciate the journey, be inspired about future possibilities, and get a sense of why things are the way they are today.
500 BC: In China, the abacus was invented. It was popularized during the Ming dynasty (1368 AD to 1644 AD) and has been a very popular calculating device ever since. The abacus is still in use today.
1642: In France, Blaise Pascal invented the mechanical calculator, called the Arithmetic Machine. It could only do addition and subtraction, with numbers entered by manipulating its dials.
1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. These cards were known as punch cards and early computers would use similar punch cards.
1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. He first designed and built the Difference Engine and then designed the Analytical Engine, which is considered the first general-purpose computer. The Analytical Engine lacked funding and was never built.
1842: English mathematician Ada Lovelace writes the world's first program, an algorithm for the theoretical Analytical Engine. She is considered the world's first programmer.
An algorithm is a series of steps that solves a specific problem.
1890: Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.
Back in the day, computing was expensive. The electronic components powering computers were large and costly, and computing anything of value took up a great deal of space and money. When World War II broke out, governments started pouring money into computer research, which led to major breakthroughs and advancements in the field of computing.
1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts, or shafts.
1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California, garage, according to the Computer History Museum.
1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information on its main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the Universal Automatic Computer (UNIVAC), the first commercial computer for business and government applications.
1947: William Shockley, John Bardeen, and Walter Brattain of Bell Laboratories invented the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum.
1951: Magnetic tape is first used to record computer data, on the Eckert-Mauchly UNIVAC I. The system's UNISERVO I tape drive used a thin strip of nickel-plated bronze (called Vicalloy) one half inch (12.65 mm) wide.
1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.
1954: The TRADIC (TRAnsistor DIgital Computer, or TRansistorized Airborne DIgital Computer) is the first transistorized computer in the USA.
1954: The FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan.
1956: IBM developed and shipped the first commercial Hard Disk Drive (HDD), the Model 350 disk storage unit, to Zellerbach Paper, San Francisco in June 1956 as part of the IBM 305 RAMAC (Random Access Method of Accounting and Control) system.
1957: The FORTRAN team led by John Backus at IBM is generally credited with introducing the first complete compiler. (Grace Hopper had written the first compiler, for the A-0 language, back in 1952.)
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.
1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.
1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.
1971: Intel releases the first true microprocessor, the Intel 4004, as a single MOS LSI (Large-Scale Integration) chip.
1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers.
1972: Atari develops coin-operated games, called arcade games.
1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.
1973: Xerox PARC releases the Xerox Alto, the first computer to support an operating system based on a graphical user interface (GUI).
1974-1977: A number of personal computers hit the market, including Scelbi & Mark-8 Altair, IBM 5100, Radio Shack's TRS-80 — affectionately known as the "Trash 80" — and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computers on April Fool's Day and roll out the Apple I, the first computer with a single-circuit board, according to Stanford University.
1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.
1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.
1979: Word processing becomes a reality as MicroPro International releases WordStar. "The defining change was to add margins and word wrap," said creator Rob Barnaby in email to Mike Petrie in 2000. "Additional changes included getting rid of command mode and adding a print function. I was the technical brains — I figured out how to do it, and did it, and documented it. "
1981: The first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's MS-DOS operating system. It has an Intel chip, two floppy disks, and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.
1983: Richard Stallman launches the GNU Project, a free and open-source Unix-like operating system, meaning that anyone can modify and use it.
1983: Apple's Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop."
1985: Microsoft announces Windows. This was the company's response to Apple's GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.
1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.
1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to mainframes.
1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.
1991: Linus Torvalds, a student at the University of Helsinki, develops a Unix-based operating system kernel called Linux. In 1994, the first stable Linux kernel, version 1.0, is released.
1993: The Pentium microprocessor advances the use of graphics and music on PCs.
1994: PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the market.
1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.
1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple's court case against Microsoft in which it alleged that Microsoft copied the "look and feel" of its operating system.
1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.
2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.
2003: The first 64-bit processor, AMD's Athlon 64, becomes available to the consumer market.
2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.
2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market.
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.
2010: Apple unveils the iPad, changing the way consumers view media and jump-starting the dormant tablet computer segment.
2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.
2012: Facebook gains 1 billion users on October 4.
2015: Apple releases the Apple Watch. Microsoft releases Windows 10.
2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.
2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures." [Computers of the Future May Be Minuscule Molecular Machines](https://www.livescience.com/58553-darpa-creates-molecular-computer-initiative.html)
Computer Language
At its core, a computer works with 0s and 1s, performing millions or billions of operations per second. The communication system a computer uses is referred to as the binary system, or base-2 numeral system. Instead of the alphabet that human language uses, a computer uses just 0s and 1s to communicate.
Binary digits, or bits, are grouped into sets of 8, and 8 bits equal one byte. Each byte can store one character, which gives 256 possible values, since $2^8=256$. Bytes can represent anything on a computer: videos, text, games, everything is made up of bytes of information.
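A minimal Python sketch of this idea: a byte's 8 bits give exactly 256 possible values, and any value from 0 to 255 can be written out as 8 bits.

```python
# 8 bits per byte gives 2**8 = 256 possible values.
print(2 ** 8)                 # 256

# Any value from 0 to 255 fits in one byte; here it is written out as 8 bits.
value = 200
print(format(value, "08b"))   # '11001000'
```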
Character Encoding
Taking into consideration that a byte allows 256 possible values, character encoding makes it possible to display letters, emojis, and other language symbols. Character encoding assigns binary values to characters so that humans can read them; it is kind of like a dictionary the computer looks up to decide which human-readable character to show to the user.
The oldest character encoding standard in use is ASCII, for American Standard Code for Information Interchange. It represents the English alphabet, numbers, and punctuation marks. ASCII used only the first 128 of the 256 possible values (codes 0-127) and lasted for a very long time, but eventually wasn't enough to support all languages and their symbols. Check out the ASCII table here: 🔗ASCII Values
Conversion chart: ASCII values to binary, octal, decimal, and hexadecimal.
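As a small Python sketch of the "dictionary" idea, `ord` looks up a character's numeric value and `chr` goes the other way:

```python
# Looking up character values, and going from a value back to a character.
print(ord("A"))   # 65
print(ord("a"))   # 97
print(chr(72))    # 'H'
print(chr(33))    # '!'
```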
To overcome the limitations of ASCII, UTF-8 was introduced and is the most prevalent encoding standard used today. It includes the same values as ASCII along with support for multiple languages and emojis, which it accomplishes by using a variable number of bytes per character. UTF-8 is based on the Unicode standard, which helps represent characters in a consistent manner.
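A short Python sketch showing the variable-length encoding at work; different characters take different numbers of bytes:

```python
# UTF-8 spends a different number of bytes on different characters.
for ch in ["A", "é", "€", "😀"]:
    data = ch.encode("utf-8")
    print(ch, len(data), list(data))
# 'A' -> 1 byte, 'é' -> 2 bytes, '€' -> 3 bytes, '😀' -> 4 bytes
```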
Computers use the RGB color model, in which three values, one each for red, green, and blue, represent a color. Each value sets the shade of its component, and together they determine the color of a pixel on the screen. With just 8 bits (one byte) of 0s and 1s per component, everything visual can be represented.
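A small sketch of one common convention, where a color is written as a hex triplet and each component occupies one byte (the particular color value below is just an example):

```python
# Split a hex color triplet into its red, green, and blue bytes.
color = 0xFF8800              # an orange tone, chosen only as an example
red   = (color >> 16) & 0xFF  # 255
green = (color >> 8) & 0xFF   # 136
blue  = color & 0xFF          # 0
print(red, green, blue)       # 255 136 0
```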
Binary
The 0s and 1s are nothing but the ON or OFF state of a circuit. In Jacquard's loom, a 0 would be NO HOLE and a 1 would be a HOLE: where the loom reached a hole it would thread underneath, and where there was no hole, it would not. Using this, complex patterns were woven. This is the foundational concept of binary, ONs and OFFs. Binary in today's computers is done with electricity: if there is voltage, it is ON, and if there isn't, it is OFF. This is done via transistors.
Logic gates allow transistors to do more complex tasks, such as deciding where to send electrical signals depending on logical conditions. There are many different types of logic gates. To read more on logic gates, click here: 🔗Logic Gates
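To make the idea concrete, here is a small Python sketch that mimics the truth tables of a few common gates (real gates are built from transistors; this only models their behavior in software):

```python
# Truth tables for a few common logic gates, modeled in software.
def AND(a, b): return a and b
def OR(a, b):  return a or b
def XOR(a, b): return a != b
def NOT(a):    return not a

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b), NOT(a))
```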
Counting in Binary
Binary is the fundamental communication block of a computer: it is how computers count. But humans don't count that way. Children are taught to count on their fingers, using what is called the decimal system, or base-10 system, which has ten possible digits, 0 through 9. In a binary system there are only two possible digits, 0 and 1. Binary values are converted into decimal so that humans can understand them; any number can be represented with bits (binary digits) and is usually displayed as a decimal.
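A quick Python sketch of converting between the two systems:

```python
# Built-in conversions between binary and decimal.
print(int("1010", 2))   # 10 -- read '1010' as a base-2 number
print(bin(10))          # '0b1010'

# The same conversion by hand: each bit stands for a power of two.
bits = "1010"
total = sum(int(bit) * 2 ** power
            for power, bit in enumerate(reversed(bits)))
print(total)            # 10
```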

These are the basics of how a computer converts binary code into human-readable characters by way of decimal values; character encoding systems allow the computer to map binary values to the appropriate characters.
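For instance, the binary for the word "hello" can be followed from bits to decimal values to characters (a minimal sketch using standard ASCII/UTF-8 values):

```python
# binary -> decimal -> character, using the ASCII values for 'hello'
binary_hello = ["01101000", "01100101", "01101100", "01101100", "01101111"]
decimals = [int(b, 2) for b in binary_hello]
print(decimals)                           # [104, 101, 108, 108, 111]
print("".join(chr(d) for d in decimals))  # hello
```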
Abstraction
Humans interact with a computer using a mouse, keyboard, and/or touch screen; they do not directly feed the exact 1s and 0s to the machine in order for it to perform calculations. This happens behind the scenes thanks to a concept called Abstraction*, whereby a relatively complex system is simplified for the end user.
*The process of removing physical, spatial, or temporal details or attributes in the study of objects or systems to focus attention on details of greater importance; it is similar in nature to the process of generalization.
For example, in the case of driving a car, there is no need for the driver to know how to operate the engine and transmission directly; it is enough to know how the pedals and steering wheel work. Whether the car was made by Volkswagen, BMW, or Honda, these elements remain much the same. This is abstraction in action.
Abstraction hides complexity by providing a common interface. In the case of a computer, it is not necessary to know how the machine works under the hood; a mouse, keyboard, and/or touch screen is all that is needed to operate and interact with the computer.
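As a rough sketch of the idea in code (the function and helper names below are hypothetical, not a real API), the caller only sees one simple interface, while the messy details stay hidden behind it:

```python
# The simple interface the user interacts with (hypothetical example).
def send_message(recipient, text):
    packet = _build_packet(recipient, text)  # complexity hidden from the caller
    _transmit(packet)

# The details hidden behind the interface.
def _build_packet(recipient, text):
    return {"to": recipient, "body": text.encode("utf-8")}

def _transmit(packet):
    print(f"sending {len(packet['body'])} bytes to {packet['to']}")

send_message("alice@example.com", "hello")
```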
Yet another example of abstraction, particularly in the IT world, is the error message. A simple error message contains the information needed to get a lead on troubleshooting an issue, rather than requiring someone to manually dig through the system to find the problem. This saves a lot of time and puts the person troubleshooting the issue closer to the solution.
Computer Architecture Overview
As IT support personnel, it is important to know and understand the layers of a computer and how they work in unison. This is tremendously helpful in resolving any issues that arise.
A computer can be divided into four main layers:

