Posts

2024

Kubernetes and other container orchestration tools

  • 1 min read

Containers – and more specifically, containerized microservices or serverless functions – have become the de facto compute standard of modern cloud-native applications. Containers are a method of packaging and isolating applications into standardized units that can be easily moved between environments. Unlike traditional virtual machines (VMs), which virtualize an entire operating system, containers virtualize only the application layer, making them more lightweight, portable, and efficient. This makes containers a natural fit for modern cloud-native development practices, where applications must be portable, efficient, and scalable.
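
As a minimal sketch of how lightweight this is in practice, the snippet below starts a throwaway container with the Docker SDK for Python. It assumes a local Docker daemon and the `docker` package are installed; the `alpine:3.19` image tag is just an example.

```python
import docker

# Connect to the local Docker daemon using environment defaults.
client = docker.from_env()

# Run a command inside a small Alpine container; the container is
# created, started, and removed in a fraction of the time a full VM
# would need to boot an entire operating system.
output = client.containers.run(
    "alpine:3.19",                       # example image tag (assumption)
    ["echo", "hello from a container"],
    remove=True,                         # clean up the container afterwards
)
print(output.decode())
```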

Read More

The Evolution of Big Data

  • 1 min read

Although the concept of big data itself is relatively new, the origins of large data sets go back to the 1960s and ‘70s, when the world of data was just getting started with the first data centers and the development of the relational database. In 1985, Bill Inmon coined the term “data warehouse,” defining it as a “subject-oriented, nonvolatile, integrated, time-variant collection of data in support of management’s decisions.”

Read More


2023

Asynchronous Messaging

  • ~1 min read

In the real world, asynchronous messaging is a communication method in which participants on both sides of a conversation are free to start, pause, and resume messaging on their own terms, eliminating the need to wait for a direct live connection (as synchronous messaging requires).
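
A minimal in-process sketch of this idea, using Python's standard `queue` module: the sender drops messages into a shared queue and moves on, and the receiver picks them up later, on its own schedule. (Real systems put a message broker between the two; the names here are illustrative.)

```python
import queue
import threading
import time

# The shared queue decouples sender from receiver: neither side
# needs the other to be "live" at the moment a message is sent.
mailbox = queue.Queue()

def sender():
    for i in range(3):
        mailbox.put(f"message {i}")  # fire and forget
        time.sleep(0.1)

def receiver():
    time.sleep(0.5)                  # comes online later, on its own terms
    while not mailbox.empty():
        print("received:", mailbox.get())

threads = [threading.Thread(target=sender), threading.Thread(target=receiver)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```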

Read More

Evolution of Data Serialization Formats

  • 1 min read

Computer programs need to read and write data, which is stored in databases or files, or transmitted as messages across the network. When data is exported from an in-memory representation onto some sort of external medium, it must be encoded. For simple data types such as plain text and numbers, encoding is straightforward, but it becomes complex when we need to deal with data structures such as objects, arrays (lists), trees, and tables. Initially, data was structured in a proprietary manner per application. However, interoperability required standard formats, and this is where data serialization comes into play.
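
As a small illustration of the structured case, here is a nested object containing an array, encoded and decoded with JSON (used here simply as one familiar standard format):

```python
import json

# An in-memory structure mixing objects (dicts), arrays (lists), and scalars.
order = {
    "id": 42,
    "items": [
        {"sku": "A-1", "qty": 2},
        {"sku": "B-7", "qty": 1},
    ],
}

encoded = json.dumps(order)    # serialize: in-memory structure -> text
decoded = json.loads(encoded)  # deserialize: text -> in-memory structure
assert decoded == order        # the round trip preserves the structure
print(encoded)
```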

Read More

Evolution of Remote Procedure Calls into Web Services

  • 1 min read

As soon as computer networks emerged, software developers started building distributed applications. Initially these were simple client-server applications that communicated through proprietary application-level protocols over TCP or UDP (using socket connections). But soon came the need to use services provided by third-party applications through public application programming interfaces (APIs). This need led to the development of standard protocols for invoking remote APIs. These protocols have evolved into Web Services.
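
To make the starting point concrete, below is a minimal sketch of such a "proprietary protocol over a socket connection": a TCP server that answers each request with an `echo:` prefix. The port number and message format are invented for illustration.

```python
import socket
import threading

# Set up the listening socket first so the client cannot race ahead.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 5050))  # arbitrary local port for the example
srv.listen(1)

def serve_one():
    # Accept a single connection and apply our "proprietary" rule:
    # reply with the request, prefixed by "echo:".
    conn, _ = srv.accept()
    request = conn.recv(1024).decode()
    conn.sendall(("echo:" + request).encode())
    conn.close()

t = threading.Thread(target=serve_one)
t.start()

# The client invokes the remote service by sending a request message.
cli = socket.create_connection(("127.0.0.1", 5050))
cli.sendall(b"hello")
print(cli.recv(1024).decode())  # -> echo:hello
cli.close()
t.join()
srv.close()
```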

Read More

TCP/IP, HTTP and Other Protocols

  • 1 min read

A protocol is a set of rules and guidelines that dictate how communication should take place between two or more entities. In networking, a protocol is a set of rules for formatting and transmitting data over a computer network. Network protocols are like a common language for computers. The computers within a network may use vastly different software and hardware; however, the use of protocols enables them to communicate with each other regardless.
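
A protocol in this sense is concrete enough to type out by hand. The sketch below speaks HTTP/1.1 over a raw TCP socket; because client and server follow the same formatting rules, they understand each other despite sharing no software. It assumes outbound network access and uses example.com only as a well-known test host.

```python
import socket

# HTTP is an agreed-upon text format carried over TCP: any client that
# follows the rules can talk to any server that follows them.
sock = socket.create_connection(("example.com", 80))
sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")

response = b""
while chunk := sock.recv(4096):  # read until the server closes the connection
    response += chunk
sock.close()

print(response.split(b"\r\n")[0].decode())  # status line, e.g. "HTTP/1.1 200 OK"
```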

Read More


2022

Relational vs. NoSQL Databases

  • 1 min read

Most applications need to store and retrieve data. While file systems enable basic data storage, they lack the ability to efficiently find, retrieve, and update specific data items within files. The first database was invented in 1961 at GE and included a data model, a description language, and a manipulation language (store, retrieve, modify, delete). In 1968, flat-file-based databases were introduced. Then the hierarchical database came into existence when IBM introduced its first database, IMS (Information Management System). In a hierarchical database, data is organized in a tree-like structure, with each parent node having multiple child nodes. This type of database was popular in the early days of computing, but it has since been largely replaced by more flexible and powerful database systems.
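
Those four manipulation-language verbs (store, retrieve, modify, delete) map directly onto the operations a modern relational database still exposes. Here is a minimal sketch using Python's built-in `sqlite3` module, with a made-up `employees` table:

```python
import sqlite3

# An in-memory relational database with a single example table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")

db.execute("INSERT INTO employees (name, dept) VALUES (?, ?)", ("Ada", "R&D"))   # store
row = db.execute("SELECT * FROM employees WHERE name = ?", ("Ada",)).fetchone()  # retrieve
db.execute("UPDATE employees SET dept = ? WHERE name = ?", ("Ops", "Ada"))       # modify
db.execute("DELETE FROM employees WHERE name = ?", ("Ada",))                     # delete

print(row)  # -> (1, 'Ada', 'R&D')
```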

Read More

Cloud Computing as a Service

  • 1 min read

The cloud is a simple concept: one company rents out its hardware to other companies as software that is delivered over the internet. At the base level, cloud infrastructure virtualizes (represents hardware as software) the building blocks of computing: compute power, storage, and networking.

Read More

The Internet

  • 1 min read

The Internet began as an academic research project in 1969, but its official birthday is considered to be January 1, 1983. On that day, the Advanced Research Projects Agency Network (ARPANET) and the Defense Data Network officially switched to the TCP/IP standard. By 1987, there were nearly 30,000 hosts on the Internet. In 1989, Tim Berners-Lee wrote the proposal for the World Wide Web. In 1990, the first commercial dial-up Internet service provider enabled Internet connections for home users. In 1991, the first web page was created.

Read More

Cloud Storage

  • 2 min read

Data storage refers to magnetic, optical, or mechanical media that record and preserve digital information for ongoing or future operations. It is a fundamental component of computer systems and an essential enabler for executing computer programs. A computer’s CPU can’t actually compute anything or produce any output without input data. Users can enter input data directly into a computer; this works for small amounts of data, but it is not practical for larger volumes, including the actual program instructions.

Read More

Virtual Computing

  • 1 min read

In the past, physical servers functioned much like regular computers. The server was a physical machine on which the operating system was installed, with the applications running on top of it. These types of servers are often referred to as ‘bare metal servers’, as there’s nothing in between the actual physical (metal) machine and the operating system. Usually, these servers were dedicated to one specific purpose, such as running one designated system.

Read More

The Bare Metal Machine

  • 1 min read

As we start our journey to the Cloud, it makes sense to begin with the lowest layer of computing: the computer hardware. Any software is useless without hardware. The cloud infrastructure is based on an enormous number of physical computers, also known as servers or simply machines. Although you don’t own them and can’t see them, they are somewhere out there on the ground (or maybe below the ground, or even underwater), consuming a lot of energy and running your software. This is true even if you use so-called “serverless” services.

Read More
