2024
Load Balancer, Reverse Proxy and API Gateway
As the digital landscape expands and the number of users accessing software applications and systems over the Internet at any given second multiplies, managing application traffic has become a critical aspect of ensuring optimal system performance.
Kubernetes and other container orchestration tools
Containers – and more specifically, containerized microservices or serverless functions – have become the de facto compute standard of modern cloud-native applications. Containers are a method of packaging and isolating applications into standardized units that can easily be moved between environments. Unlike traditional virtual machines (VMs), which virtualize an entire operating system, containers virtualize only the application layer, making them more lightweight, portable and efficient. They work hand in hand with modern cloud-native development practices, making applications easier to ship, scale and run consistently across environments.
The Evolution of Big Data
Although the concept of big data itself is relatively new, the origins of large data sets go back to the 1960s and ’70s, when the world of data was just getting started with the first data centers and the development of the relational database. In 1985, Bill Inmon coined the term “data warehouse,” defining it as a “subject-oriented, nonvolatile, integrated, time-variant collection of data in support of management’s decisions.”
2023
Asynchronous Messaging
In the real world, asynchronous messaging is a communication method in which participants on both sides of the conversation have the freedom to start, pause, and resume messaging on their own terms, eliminating the need to wait for a direct live connection (as synchronous messaging requires).
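The idea can be sketched in a few lines of Python: an in-memory queue stands in for a message broker (such as RabbitMQ or Kafka), so the sender enqueues messages and moves on immediately, while the receiver consumes them on its own schedule. The names here are illustrative, not from any particular messaging library.

```python
import queue
import threading

# The queue acts as a stand-in for a message broker: it decouples
# the sender from the receiver in time.
broker = queue.Queue()

def sender():
    for i in range(3):
        # put() returns immediately -- the sender never waits
        # for the receiver to be ready (asynchronous messaging).
        broker.put(f"message {i}")

def receiver(results):
    while True:
        msg = broker.get()       # blocks until a message is available
        if msg is None:          # sentinel value: no more messages
            break
        results.append(msg)

results = []
t = threading.Thread(target=receiver, args=(results,))
t.start()
sender()
broker.put(None)                 # signal the receiver to stop
t.join()
print(results)
```

The key point is that `sender()` finishes regardless of when the receiver gets around to processing each message — the queue holds them in the meantime.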
Evolution of Data Serialization Formats
Computer programs need to read and write data, which is stored in databases or files, or transmitted as messages across the network. When data is exported from an in-memory representation into some sort of external media, it must be encoded. For simple data types such as plain text and numbers, encoding is straightforward. But it becomes complex when we need to deal with data structures such as objects, arrays (lists), trees and tables. Initially, data was structured in a proprietary manner per application. However, in order to allow interoperability, standard formats were required, and this is where data serialization comes into play.
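As a minimal sketch of the idea, the snippet below encodes an in-memory data structure (nested dictionaries and lists) into a standard, interoperable text format — JSON — and decodes it back. The field names are made up for illustration.

```python
import json

# An in-memory data structure: nested objects and a list.
record = {
    "user": "alice",
    "scores": [90, 85, 77],
    "address": {"city": "Paris", "zip": "75001"},
}

encoded = json.dumps(record)   # serialize: object graph -> JSON text
decoded = json.loads(encoded)  # deserialize: JSON text -> object graph

# The round trip preserves the structure and values.
assert decoded == record
print(encoded)
```

Because JSON is a standard format, any other program — in any language — can decode `encoded` and reconstruct the same structure, which is exactly the interoperability problem serialization solves.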
Evolution of Remote Procedure Calls into Web Services
As soon as computer networks emerged, software developers started building distributed applications. Initially these were simple client-server applications that communicated through proprietary application-level protocols over TCP or UDP (using socket connections). But soon came the need to use services provided by third-party applications through public application programming interfaces (APIs). This need led to the development of standard protocols for invoking remote APIs, which have since evolved into Web Services.
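The core mechanism behind a remote procedure call can be sketched without any network at all: the caller encodes a procedure name and its arguments into a standard message (here in the style of JSON-RPC), and the server decodes that message and dispatches it to a local function. The `add` procedure and the dispatch table are hypothetical examples, not part of any specific framework.

```python
import json

# Server side: a registry of procedures exposed to remote callers.
def add(a, b):
    return a + b

PROCEDURES = {"add": add}

def handle_request(raw):
    """Decode a JSON-RPC-style request, call the procedure, encode the reply."""
    req = json.loads(raw)
    result = PROCEDURES[req["method"]](*req["params"])
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# Client side: the call is turned into a message that could be sent
# over TCP or HTTP; here we just pass it directly to the handler.
request = json.dumps({"jsonrpc": "2.0", "id": 1,
                      "method": "add", "params": [2, 3]})
response = json.loads(handle_request(request))
print(response["result"])
```

Standards such as XML-RPC, SOAP and JSON-RPC differ mainly in how the message on the wire is formatted; the encode-transmit-dispatch pattern above is common to all of them.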
TCP/IP, HTTP and Other Protocols
A protocol is a set of rules and guidelines that dictate how communication should take place between two or more entities. In networking, a protocol is a set of rules for formatting and transmitting data over a computer network. Network protocols are like a common language for computers. The computers within a network may use vastly different software and hardware; however, the use of protocols enables them to communicate with each other regardless.
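A toy example makes the "set of rules" concrete: below, two endpoints agree that every message is a UTF-8 string terminated by a newline. The rule itself is arbitrary (this is not a real standard), but once both sides follow it, they can exchange messages reliably over a raw byte stream.

```python
import socket

# A connected pair of sockets, standing in for two machines on a network.
a, b = socket.socketpair()

def send_line(sock, text):
    """Our protocol rule: one message = UTF-8 bytes ending in a newline."""
    sock.sendall(text.encode("utf-8") + b"\n")

def recv_line(sock):
    """Read bytes until the newline delimiter, per the agreed rule."""
    data = b""
    while not data.endswith(b"\n"):
        data += sock.recv(1)
    return data[:-1].decode("utf-8")

send_line(a, "HELLO")
greeting = recv_line(b)
send_line(b, "HELLO " + greeting)   # reply according to the same rule
reply = recv_line(a)
print(reply)
a.close()
b.close()
```

Real protocols such as HTTP work the same way at heart — they just define far richer rules for framing, headers, and semantics on top of the byte stream.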
2022
Relational vs. NoSQL Databases
Most applications need to store and retrieve data. While file systems enable basic data storage, they lack the ability to efficiently find, retrieve and update specific data items within files. The first database was invented in 1961 at GE and included a data model, a description language, and a manipulation language (store, retrieve, modify, delete). In 1968, flat-file-based databases were introduced. The hierarchical database then came into existence when IBM introduced its first database, IMS (Information Management System). In a hierarchical database, data is organized in a tree-like structure, with each parent node having multiple child nodes. This type of database was popular in the early days of computing, but it has since been largely replaced by more flexible and powerful database systems.
Cloud Computing as a Service
The cloud is a simple concept: one company rents out its hardware to other companies as services delivered over the internet. At the base level, cloud infrastructure virtualizes (represents hardware as software) the building blocks of computing: compute power, storage, and networking.
The Internet
The Internet began as an academic research project in 1969, but its official birthday is considered January 1, 1983. On that day, the Advanced Research Projects Agency Network (ARPANET) and the Defense Data Network officially switched to the TCP/IP standard. By 1987, there were nearly 30,000 hosts on the Internet. In 1989, the proposal for the World Wide Web was written by Tim Berners-Lee. In 1990, the first commercial dial-up Internet service provider enabled Internet connection for home users. In 1991, the first web page was created.
Cloud Storage
Data storage refers to magnetic, optical or mechanical media that records and preserves digital information for ongoing or future operations. It is a fundamental component of computer systems and an essential enabler for executing computer programs. A computer’s CPU can’t actually compute anything or produce any output without input data. Users can enter input data directly into a computer; this works for small amounts of data, but it is not practical for larger volumes, including the program instructions themselves.
Virtual Computing
In the past, physical servers functioned much like regular computers: the server was a physical machine with the operating system installed directly on it and applications running on top. These types of servers are often referred to as ‘bare metal servers’, as there is nothing between the actual physical (metal) machine and the operating system. Usually, such servers were dedicated to one specific purpose, such as running one designated system.
Brief History of Operating Systems
In my previous post, I talked about computer hardware as the lowest level of computing. Moving upwards through the technology stack, it is hard to imagine a modern computer or a computing device without an Operating System (OS) that manages its hardware and software. But it wasn’t always like that…
The Bare Metal Machine
As we start our journey to the Cloud, it would make sense to start with the lowest layer of computing, which is the computer hardware. Any software is useless without hardware. The cloud infrastructure is based on an enormous amount of physical computers, also known as servers or simply machines. Although you don’t own them and you can’t see them, they are somewhere out there on the ground (or maybe below the ground or even underwater), consuming a lot of energy and running your software. This is true even if you use so-called “serverless” services.
Welcome to Stairway to the Cloud
Hello and welcome to my new website, titled: “Stairway to the Cloud”.