Introduction to Blockchain – Architectural Overview (Part 3)

In today’s post, we will look into the architectural details of blockchain technology. Along the way, we will touch upon the concept of Smart Contracts and then examine the internal structure of a block in the Bitcoin blockchain.

Interesting project but no money ☹️

Let us say your friend Alice has an interesting project idea and she wants to start it as soon as possible, but she has no (or little) funding. She is very upset, and as a good friend you come to her rescue (a friend in need is a friend indeed 😊). You suggest that she crowdfund the project. She is intrigued and inquires about it.

What is crowdfunding and how does it work?

Put simply, crowdfunding is the process of getting a large crowd of people to pay a small amount of money to invest in an idea or product, with the promise that when the said idea comes to fruition, they will get first access to it. The idea can be anything, from an innovative piece of tech to a video game to a novelty gift.

  • Both the product team and the supporters need to trust the crowdfunding platform (e.g., Kickstarter).
  • The product team expects the money to be paid as the project progresses.
  • The supporters expect the money to go to the project.

Clearly, this is a centralized system, and its major drawback is that some monetary benefit goes to the platform (e.g., Kickstarter) for its efforts in managing the entire process. The solution to this problem leads us to the concept of “Smart Contracts” – a decentralized version of the same platform.

Smart Contracts

In 1994, Nick Szabo proposed the idea of smart contracts, or self-executing contracts, and realized that a decentralized system could be used to implement them. Here, contracts are converted into computer code, stored and replicated on the system, and supervised by the network of computers that run the blockchain.

Thus, smart contracts help us exchange money, property, shares, or anything of value in a transparent, conflict-free way while avoiding the services of a middleman.


The best way to describe smart contracts is to compare the technology to a vending machine. Ordinarily, you would go to a lawyer or a notary, pay them, and wait to get the document. With smart contracts, you simply drop a bitcoin into the vending machine (i.e. the ledger), and your escrow, driver’s license, or whatever else drops into your account. Moreover, smart contracts not only define the rules and penalties around an agreement in the same way that a traditional contract does, but also automatically enforce those obligations.

If we apply Smart Contracts to our example of crowdfunding, then the following things will happen –

  1. The contract is written as “code” which is available to the stakeholders – the product team (Alice) and the supporters (note – a typical example of a blockchain)
  2. The code automatically transfers money as certain goals (contracts) of the project are accomplished.
  3. If the project goals (or contracts) fail, then the code will transfer money back to the supporters.

Advantages of Smart Contracts

  • Immutable – No party will be able to change the contract once it is written to the public ledger.
  • Distributed – Every stakeholder can validate the contract.


A block in the Bitcoin blockchain consists of two main components – a block header and a list of transactions. Let us look into these components one by one –

Block Header

The header contains metadata about a block. There are three different sets of metadata:

  • The previous block hash – in a blockchain, every block inherits from the previous block as it uses the previous block’s hash to create the new block’s hash. For every block N, the hash of the block N-1 is used.
  • Mining competition – each block in a blockchain has a timestamp, a nonce and a difficulty. Mining must be hard enough to make the block tamper-proof. The nonce is a number that miners vary until the hash of block N satisfies a predefined difficulty condition – e.g., for Bitcoin, the hash must start with a given number of zeros.
  • Merkle tree root – a data structure that summarizes the transactions in the block. The root is the verification of all the transactions. I discussed the Merkle algorithm in detail in my first post here.
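The three sets of header metadata above can be combined into a block hash with a short Python sketch. This is illustrative only: real Bitcoin headers are compact 80-byte binary structures, not JSON, and the field values below are made up.

```python
import hashlib
import json

def block_hash(prev_hash, merkle_root, timestamp, difficulty, nonce):
    """Double-SHA256 over the header fields, as Bitcoin does (simplified)."""
    header = json.dumps({
        "prev_hash": prev_hash,
        "merkle_root": merkle_root,
        "timestamp": timestamp,
        "difficulty": difficulty,
        "nonce": nonce,
    }, sort_keys=True).encode()
    return hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()

# Block N's hash commits to block N-1's hash, chaining the blocks together.
genesis = block_hash("0" * 64, "merkle_root_of_block_0", 1000, 4, 12345)
block_1 = block_hash(genesis, "merkle_root_of_block_1", 1600, 4, 67890)
```

Because `prev_hash` is an input, changing anything in block 0 would change `genesis`, which in turn would change `block_1`, and so on down the chain.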




Transactions are organized as a Merkle tree whose root is used to construct the block hash. A Merkle tree is constructed by recursively hashing pairs of nodes (in this case, transactions) until there is only one hash, called the root or Merkle root. In the Bitcoin world, the cryptographic hash algorithm used is SHA-256, applied twice at each step (SHA-256d).

This link discusses the Merkle tree in more detail.

This gives us the ability to determine, almost with certainty, whether someone has tampered with a block. If a malicious attacker tries to tamper with one transaction, the hash of that transaction changes, and so do the hashes of all subsequent blocks. Recomputing valid hashes for all those blocks is so expensive that it would take an enormous amount of computing power and time to corrupt them before a new block is generated. Thus, it is almost impossible for an attacker to reach the end of the chain and corrupt all of the blocks.
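This tamper-detection argument can be sketched in a few lines of Python. It is a toy model (single SHA-256 over strings, not Bitcoin's actual data structures): each block stores its predecessor's hash, so re-validating the chain exposes any modification.

```python
import hashlib

def h(data, prev_hash):
    return hashlib.sha256((data + prev_hash).encode()).hexdigest()

def build_chain(tx_data):
    chain, prev = [], "0" * 64
    for data in tx_data:
        block = {"data": data, "prev_hash": prev, "hash": h(data, prev)}
        chain.append(block)
        prev = block["hash"]
    return chain

def is_valid(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev or block["hash"] != h(block["data"], prev):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["tx: A->B 5", "tx: B->C 2", "tx: C->A 1"])
assert is_valid(chain)
chain[1]["data"] = "tx: B->C 200"   # attacker tampers with one transaction
assert not is_valid(chain)          # every honest node detects the mismatch
```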


Blockchain Replicas

  • Every peer in a blockchain network maintains a local copy of the blockchain.
  • All the replicas need to be updated with the latest mined block.
  • All the replicas need to be consistent i.e. all the copies of the blockchain with peers must be identical.

Distributed Consensus

Distributed consensus, also known as Nakamoto consensus, is a method of establishing canonical state in a system. Think of it this way: there are an infinite number of possible wrong answers, but only one right answer. In typical systems, this is resolved by a central authority that determines validity and provides data, also known as a trusted third party (TTP).

The problem of infinite wrong answers means there must be a way to find validity in blocks. Instead of using a central authority, distributed nodes receive new blocks, and either choose to withhold or propagate them. In such a system, there is assumed to be an “uncoordinated, honest majority” that prevents a few malicious nodes from propagating incorrect blocks. This is also called Byzantine Fault Tolerance (BFT).

Using the consensus rules of Bitcoin, for example, transactions must have valid signatures, originate from spends that are available (there must be enough balance to spend), be put into a Merkle tree with SHA-256d, and have a Hashcash-style proof of work attached that is under the widely agreed-upon network difficulty, among other rules. If there are multiple valid blocks that branch off at some point, pick the longest series of blocks. Ultimately, because such an overwhelming majority follows Bitcoin's consensus rules exactly as they are written, consensus can be established over the Internet without any real problems.
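The "pick the longest valid chain" rule can be sketched as follows. The helper names and the link-check are hypothetical simplifications: a real Bitcoin node validates far more (signatures, proof of work) and compares cumulative work, not raw block count.

```python
def is_valid_chain(chain):
    # Placeholder validity check: every block must link to its predecessor.
    return all(chain[i]["prev"] == chain[i - 1]["id"] for i in range(1, len(chain)))

def pick_canonical(chains):
    """Among valid candidate chains, pick the longest one."""
    valid = [c for c in chains if is_valid_chain(c)]
    return max(valid, key=len) if valid else []

a = [{"id": "g", "prev": None}, {"id": "b1", "prev": "g"}]
b = a + [{"id": "b2", "prev": "b1"}]      # honest branch, one block longer
c = a + [{"id": "x", "prev": "WRONG"}]    # branch with an invalid link
assert pick_canonical([a, b, c]) is b     # invalid branch ignored, longest wins
```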

A few key points related to distributed consensus are –

  1. Ensure that all the nodes in the network see the same data at nearly the same point of time.
  2. All nodes need to agree on a regular basis that the data stored by them is the same.
  3. No single point of failure – as the data is decentralized.
  4. Challenge-Response Protocol – the network poses a challenge to the peers. The node that solves the challenge first declares that it has solved it, and the transaction is verified.
  5. A good challenge is one in which different nodes win in different runs. This ensures that no single node controls the network.
  6. Proof of Work – this ensures consensus in a permissionless setting based on challenge-response.
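Proof of Work can be sketched as a brute-force search for a nonce. This is a minimal illustration with the difficulty lowered to four leading hex zeros so it runs in well under a second; Bitcoin's real target is vastly harder.

```python
import hashlib

def proof_of_work(block_data, difficulty=4):
    """Find a nonce such that the hash starts with `difficulty` zeros.
    This is the 'challenge'; any peer can verify the 'response' with one hash."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = proof_of_work("block payload")
assert digest.startswith("0000")
# Verification is cheap – a single hash – versus thousands of hashes to find the nonce.
assert hashlib.sha256(f"block payload{nonce}".encode()).hexdigest() == digest
```

The asymmetry (expensive to solve, cheap to verify) is exactly what makes the challenge-response scheme work in an open network.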


Economics of Blockchain consensus

The challenge-response protocol requires that every node spend a large amount of computational power to solve a mathematical challenge in each iteration of the consensus. But why should they do it? What is the incentive for them? The answer is that when a node expends computational power to solve the problem, it is rewarded with the cryptocurrency generated and managed by the network.


With this, we have reached the end of this post. In this post, we discussed the architecture of a blockchain on a high level along with a good application – Smart Contracts.

In the next post, we will further explore the concepts of blockchain. Stay tuned 😊


Introduction to Blockchain – Evolution (Part 2)

In the last part of this series, we discussed the basic concepts of blockchain technology. In this post, we are going to discuss the evolution of blockchain, and in the latter part of the post, I will introduce the concept of bitcoin technology at a high level.

Hashing and Hash Functions
The concept of hashing is the backbone of blockchain technology and it helps us speed up the searching process. Hashing allows us to map any data to a fixed size. The entity which performs hashing is called a Hash Function. For example, H(x) = x%n (% is the modulo operator – it returns the remainder when x is divided by n); x can be of any value but H(x) will always be in the range [0, n-1].
x is called Message
H(x) is called Message Digest
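The modulo hash function above fits in two lines of Python (n = 10 chosen arbitrarily for the example):

```python
n = 10
H = lambda x: x % n   # maps any integer message to a digest in [0, n-1]

assert H(42) == 2
assert H(1_000_003) == 3
# Different messages can collide, which is one reason H is NOT cryptographic:
assert H(7) == H(17)
```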

Cryptographic Hash Function
It is a mathematical algorithm that maps data of arbitrary size to a bit string of a fixed size (a hash). It is a one-way function i.e. we can find H(x) if x and n are given, but if H(x) and n are given, we cannot find x. The only way to find x is by brute force, and this gets tougher as the number of possibilities increases.

The ideal cryptographic hash function has five main properties:
The same message should always result in the same hash.
Significantly fast to calculate the hash of any message.
Only one way
A significantly different hash is generated even if there is a very small change in message.
No two different messages should generate the same hash.
Avalanche Effect


If you hash two messages that differ only in a single word – say, “over” changed to “Over” – you will find that the hash changes drastically. This is called the Avalanche Effect. It is very important from a security standpoint: since similar messages produce wildly different hashes, it is very hard for someone to guess the message from its hash.
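The avalanche effect is easy to check yourself with SHA-256 (a small Python demonstration, not from the original post): capitalize one letter and compare the digests.

```python
import hashlib

d1 = hashlib.sha256(b"the quick brown fox jumps over the lazy dog").hexdigest()
d2 = hashlib.sha256(b"the quick brown fox jumps Over the lazy dog").hexdigest()

# A one-character change flips roughly half the output bits,
# so the two 64-character digests share almost nothing.
matching = sum(a == b for a, b in zip(d1, d2))
assert d1 != d2
assert matching < len(d1) // 2   # far fewer matching positions than identical strings
```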

Merkle Trees
Merkle Trees (or hash trees) are a fundamental part of blockchain technology. They allow efficient verification of the consistency of large amounts of data. Both Bitcoin and Ethereum use the Merkle tree concept.


They follow a bottom-up approach, hashing pairs of nodes repeatedly until only one node/hash is left. This hash is called the Root Hash or Merkle Root.

Each leaf node represents the hash of the transactional data, while each non-leaf node represents the hash of the concatenation of its children's hashes. Merkle trees are binary trees and therefore require an even number of leaf nodes. If the number of transactions is odd, the last hash is duplicated once to create an even number of leaf nodes.
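The bottom-up construction, including the duplicate-the-last-leaf rule for an odd number of nodes, can be sketched in Python (single SHA-256 per step for clarity; as noted elsewhere, Bitcoin applies SHA-256 twice):

```python
import hashlib

def sha(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(transactions):
    level = [sha(tx.encode()) for tx in transactions]   # leaf hashes
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])                     # duplicate the last node
        level = [sha(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0].hex()

txs = ["A pays B 5", "B pays C 2", "C pays D 1"]        # 3 leaves -> last is duplicated
root = merkle_root(txs)
assert merkle_root(txs) == root                          # deterministic
assert merkle_root(["A pays B 9", "B pays C 2", "C pays D 1"]) != root  # any change alters the root
```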

In 1991, Stuart Haber and W. Scott Stornetta published a paper called How to Time-Stamp a Digital Document, in which they used Merkle Trees to timestamp digital documents so that who edited a document, and at what time, can be easily determined. In this scheme, each edit depends on the previous edit's hash. Hence, if anyone tries to tamper with a timestamp, the hash changes and the person can be caught easily. Thus, security increases manyfold.

What is Bitcoin?
In 2008, Satoshi Nakamoto (the fact that no one knows who he/she/they is – intrigues me 😵) published a whitepaper introducing the concept of Bitcoin. In this section, we will discuss the high-level details of Bitcoin and its working.

Bitcoin is a completely decentralized, peer-to-peer, permissionless electronic cash system. The keywords in the above definition are:
Decentralized – No central party for ordering or recording anything, not even government.
Peer-to-peer – The software runs on the machines of all stakeholders to form the system.
Permissionless – No identity; no need to signup anywhere to use; no access control – anyone can participate in any role – be it, sender, receiver or miner.

A sample transaction in Bitcoin


In bitcoin, each individual has a copy of the most updated blockchain or public ledger. Let us try to understand the life cycle of a typical bitcoin transaction –

The Sender –
Sender opens his/her account
Provides the address of the receiver and specifies the amount to transfer
The Network
The wallet constructs a transaction and signs it using the sender’s private key (to ensure the validity and authenticity of the transaction)
Once the transaction is constructed, it is broadcast to the network.
The network nodes validate the transaction based on the existing blockchain and propagate it to the miners.
The miners include the transaction to the next block to be mined.
The Miners
Miners collect all transactions within a fixed time interval (10 minutes for BTC)
Then they construct a new block and try to connect it with the existing blockchain through a cryptographic hash function
Once the block is generated, it is included in the blockchain and the updated blockchain is broadcast to the network.
The hash of each block is generated based on the hash of the previous block, and the task of each miner is to solve this hash puzzle (under the hood, the miners need to solve a hard mathematical problem – e.g., the hash should start with a given number of zeros – so nonce after nonce is checked until the condition is met).
The Receiver
The receiver opens his/her bitcoin wallet and refreshes, the blockchain gets updated.
The transaction gets reflected in the receiver’s account.
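The miner's part of the lifecycle above can be tied together in a small sketch: collect transactions, summarize them, and search for a nonce that meets the difficulty condition. This is a toy model (a flat hash stands in for the Merkle root, and the difficulty is only three hex zeros so it runs quickly).

```python
import hashlib

def mine_block(prev_hash, transactions, difficulty=3):
    # Step 1: summarize the collected transactions
    # (a flat hash here, standing in for the Merkle root).
    tx_summary = hashlib.sha256("".join(transactions).encode()).hexdigest()
    # Step 2: try nonces until the block hash meets the difficulty condition.
    nonce = 0
    while True:
        header = f"{prev_hash}{tx_summary}{nonce}".encode()
        digest = hashlib.sha256(header).hexdigest()
        if digest.startswith("0" * difficulty):
            return {"prev": prev_hash, "txs": transactions,
                    "nonce": nonce, "hash": digest}
        nonce += 1

block = mine_block("0" * 64, ["alice->bob:40"])
# Step 3: the mined block is broadcast; peers re-hash once to verify it.
assert block["hash"].startswith("000")
```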
Blockchain 2.0
Many mainstream companies are exploring the use of blockchain to build systems beyond financial transactions, for uses in manufacturing, supply chain, governance, IoT, etc. This revolution is being termed Blockchain 2.0.

Phew! We have reached the end of this post. We discussed the evolution of the blockchain and the principles on which it works. Later on, we discussed the high-level details of a typical bitcoin transaction. In the next post of this series, we will look into the blockchain architecture. I hope you enjoyed this post. Stay tuned! 😎

Introduction to Blockchain – Basics (Part 1)

The blockchain isn’t just the backbone of cryptocurrency – it could change the world. Applications are endless. It is one of the most sought-after skills these days. Blockchain developers are in huge demand but there are only a few who can elegantly develop a blockchain application.

This is the first blog of this series and without further ado let’s learn together the awesome concepts of Blockchain technology.

What is a blockchain?
It is a decentralized platform which enables multiple domains, who do not trust each other, to collaborate in a decision-making process.

Main keywords from the above definition are –
Decentralized – There is no single centralized authority that makes decisions on behalf of all the concerned parties. Instead, each party, also called a peer, makes local autonomous decisions towards its individual goals.
Domains – Parties who are skeptical about the authenticity of other parties.
Decision-Making Process – A common process executed by all the peers such as validating a transaction.

Types of systems
There are broadly three types of systems – centralized, distributed and decentralized. Below is the most widely used diagram that depicts these three.

Centralized Systems
It is a multiple-client, single-server kind of architecture where the server is the computer on which all of the major processing is done.

Client machines connect to the server computer (master computer) and submit their requests. For e.g. a web application server that hosts all the business logic, runs the database etc. Various client machines connect to the web application server and send/receive requests/responses through the HTTP application layer protocol.

This mechanism poses the following problems –
Single point of failure – If the master goes down, user requests cannot be served since the machine that runs the core logic is dead.
Limited scalability – Only vertical scaling is possible i.e. to add more storage, I/O bandwidth, CPU processing power for the master. Surely, there will be a limit after which increasing these parameters is not feasible.

Distributed Systems
A distributed system is a model in which components located on networked computers communicate and coordinate their actions by passing messages.

Thus, it is a system where components might span geographical boundaries but are owned and controlled by a single entity (cloud computing). Trust in such a system is still centralized. For e.g. Google.

Decentralized Systems
Here, all the processing is not done by a single machine; instead, work is distributed among various nodes, each of which can serve requests equally. For e.g. the Cassandra data store.

There isn’t really any single point of failure because client machines aren’t relying on a single server to fulfill all requests. The system comprises multiple nodes, some of which might still be available to process user requests.
We can scale-out the system by adding more nodes – Horizontal Scaling.

Working of Blockchain
In blockchain, everyone works on their local copy of a common document and it is the responsibility of the underlying process to ensure the consistency. Let us understand this with an example –

  • Suppose, there are four friends Alice, Bob, Carol, and Dwayne.
  • There is one public ledger which holds records of bank accounts of these four friends.
  • Let’s say one entry in that public ledger shows Alice has $100 in her account.
    Since each of the above four persons has a separate copy of the public ledger, each one now knows that Alice has $100 in her account.
  • Now, suppose Alice transfers $40 to Bob. This is a transaction and it will be updated as a new entry in the public ledger copy of Alice (after successful validation).
    Due to the underlying blockchain mechanism, this entry will be validated first by Bob, Carol, and Dwayne and if successful, they update the same in their respective copies of public ledger.
  • Now, as a second transaction, suppose Alice tries to transfer $80 to Carol. This is an invalid transaction as Alice only has $60 left in her account. Thus, the transaction fails validation and this entry will not be added to the public ledger.
  • From the above, when a new transaction or an edit to an existing transaction comes into a blockchain, generally a majority of the nodes within a blockchain implementation must execute algorithms to evaluate and verify the history of the individual block that is proposed.

    If a majority of the nodes come to a consensus that the history and signature are valid, the new block of transactions is accepted into the ledger and a new block is added to the chain of transactions. If a majority does not concede to the addition or modification of the ledger entry, it is denied and not added to the chain.

    This distributed consensus model is what allows blockchain to run as a distributed ledger without the need for some central, unifying authority saying what transactions are valid and (perhaps more importantly) which ones are not.
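The validation logic from the Alice/Bob example above can be sketched in a few lines of Python. This is a toy model: real blockchains validate cryptographically signed transactions, not name-keyed balances.

```python
def apply_transaction(balances, sender, receiver, amount):
    """Each peer runs the same check against its local copy of the ledger."""
    if balances.get(sender, 0) < amount:
        return False                      # invalid: insufficient funds
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount
    return True

ledger = {"Alice": 100, "Bob": 0, "Carol": 0, "Dwayne": 0}

assert apply_transaction(ledger, "Alice", "Bob", 40)        # valid: Alice has $100
assert not apply_transaction(ledger, "Alice", "Carol", 80)  # rejected: only $60 left
assert ledger["Alice"] == 60 and ledger["Bob"] == 40
```

Because every peer applies the same deterministic rule to the same history, they all reject the second transaction independently.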

Aspects of Blockchain
Protocols for Commitment – This ensures that every valid transaction from the clients is committed and included in the blockchain in finite time.
Consensus – This ensures that local copies are consistent and updated.
Security – The data needs to be tamper-proof. Note that the clients may act maliciously or can be compromised.
Privacy and Authenticity – The privacy and authenticity of transactions need to be ensured.

Overall, blockchain technology has the potential to revolutionize several industries, from advertising to energy distribution. Its main power lies in not requiring trust and in being decentralized.

In future posts, we will dive deeper into the various concepts of this technology. Stay tuned! 🌝

IoT : Interfacing physical realm with digital space

At the dawn of the new digital age, physical existence will co-exist with a digital avatar for the entire creation. The fusion of the physical with the digital will usher in a new community of digital beings. The “Internet of Things”, popularly known as IoT, will embark humanity on a journey of connected devices, an extension of our own existence.

Sensitive machines: The machines which were tools in the old age will assume the roles of assistants, colleagues, friends and even bosses in the new digital age. IoT will empower machines with the power of expression. Sensors will germinate profusely and spread over the entire landscape, with power and dimensions far beyond human perception.

The machines with the ability to express will produce oceans of data. We will know the state of everything around us: the palpitating heart, the count of steps, the health of a car, the quality of air, the purity of water, the age of appliances, the fatigue of machines and a myriad of other types of data. The question we have to ask ourselves is: what will we do when we know so much about our space?

Cognitive intelligence: Sensitive machines with artificial intelligence, such as visual perception, speech recognition, language translation and decision making, will be capable of performing tasks we never envisioned while creating them. ‘Smart’ will be the de facto prefix for all digital identities. We will stay in ‘Smart’ homes, drive ‘Smart’ cars, operate ‘Smart’ phones, recharge ‘Smart’ meters, deploy ‘Smart’ grids and live in ‘Smart’ cities.

Smart cars will talk to each other to drive efficiently, alert the garage on arrival, perform self-diagnostic checks and share health reports. The smart grid will share electricity optimally, smart meters will send digital meter readings and smart appliances will save power. A smart AC will inform you about depleted air filters and even send a request to the service center for repair.

Human intervention in maintaining and controlling machines will reduce to ordering via voice commands or instructing via text messages.

Omnipresent Clouds: The oceans of data produced by the new digital population will need a fat belly and a powerful digestion engine to produce insightful perspectives about managing life in the new digital age. Cloud data centers, powered with super computational capabilities, will be leveraged by connected digital beings for leading a ‘Smart’ lifestyle.

Just as we expect clouds to take in vapor and bring down rain to support life, we will expect ‘Smart’ clouds to take in data and bring down useful information to manage life in a ‘Smart’ city.

New Digital Identity: In the new digital space, the classification between living and non-living will be diluted into a single digital identity. ‘Smart’ interconnected machines will disrupt almost all aspects of human functioning, including healthcare, finance, education, driving and socializing.

The digital community will experience social networking at a new level. Modern chatting platform groups will include the complete digital family – our cars and appliances – with the ability to remotely connect with them. Bits 0/1 will be the new universal language that connects one and all. Machines and humans will come together at all levels to accomplish goals at scales never imagined before. The operating efficiency of ‘Smart’ machines, boosted by the creative ingenuity of human beings, will push the boundary of limitations.


Evolution of Internet of Things: The Internet evolved with the development of ARPANET to send messages from one destination to another. As computers matured from personal machines to handheld devices, the internet became the collective consciousness of the entire world.

The World Wide Web influenced the global community, disrupted social networking, elevated human knowledge and changed the rules of the game for human organizations. It brought the world closer.

With billions of digital beings added to this interconnected mesh, the power of connectivity will assert its influence more than ever in the new digital space. ‘Smart’ machines will manage our lives, leaving time for us to focus on more complex problems.

With ‘Smart’ machines looking after our survival, we will be able to afford time for diving into our consciousness and working on our well-being. ‘Smart’ machines in the new ‘Smart’ world will serve humanity as it embraces self-realization and strives for enlightenment. The future leaves the onus on us to either evolve into ‘Super’ beings or get overwhelmed by the newly found digital consciousness.

IoT : Cloud @ Edge

In the previous blog, IoT: Interfacing physical realm with digital space, we were introduced to a new digital age which will usher in a community of digital beings. Humans and machines will be part of an interconnected mesh of billions of digital entities.

The new connected ecosystem of smart machines handling critical functions of our lives poses challenges in networking, security and computing. In this blog, we will uncover cloud and edge computing and how they help in addressing the challenges of IoT. The design principles mentioned in this blog are artifacts of my experiments with IoT.

Design Principles:
The fusion of the cloud with edge devices should adhere to certain design principles in order to build a securely connected ecosystem, using distributed computing and utilizing the network optimally to reduce the cost of operation and improve productivity.
1. Cloud Computing:
IoT uses the power of the cloud for costly operations like data mining, predictive analytics, and upstream and downstream messaging. The machines send the data captured by sensors to cloud services deployed on the internet, and directly send and receive messages to and from the cloud. The evolution of embedded computing allows us not only to monitor devices using sensor information but also to control them using actuators. The machines are thus exposed to security threats as they are directly connected to the insecure mesh of the internet.

The latency in communication from the cloud to the machine makes machines less responsive to changes in their local environment. There are situations where the machines have to be quick in decision making and respond to critical changes in the local environment spontaneously.

2. Edge Computing:
Edge computing localizes certain kinds of analysis and decision-making capabilities, enabling quicker response times, less sensitivity to network latency, reduced traffic, and distribution of intelligence between cloud and edge devices. It addresses networking, security and computational challenges, and thus helps realize the full computational potential of the complete connected platform.
Cloud and Edge

3. Connection:
Devices in the local ecosystem are connected to each other using WSNs (Wireless Sensor Networks), which are not exposed to the commercial internet. The gateway of the local environment creates an ad hoc wireless network which connects all edge devices (sensors, actuators, etc.). The gateway also acts as an access point for global internet access for edge devices, and deploys a firewall for all communication with external internet services. This insulates the local ecosystem from the global internet and secures inter-device communication from external attacks. This design also decentralizes the network topology to avoid a single point of failure.

4. Integration:
Integrating embedded devices, sensors, software applications, external services and cloud services using a resilient messaging infrastructure is critical for the operational efficiency of an IoT solution. Evolving communication infrastructure is pushing the boundaries of IoT.

There are short-range communication networks like DSRC (Dedicated Short Range Communication), a short-to-medium range communication channel specially designed for machine-to-machine communication. It comes with its own set of protocols and operates in the 5.9 GHz band exclusively dedicated to it. This type of network creates WSNs (Wireless Sensor Networks) for connecting entities in the local ecosystem.

LPWANs (Low Power Wide Area Networks) like NarrowBand-IoT and 5G are new-age radio technologies designed for long-range cellular communication and low power consumption. Low-power communication comes as a blessing for embedded devices with limited power capacity.

5. Messaging:
The messaging infrastructure is the backbone of any IoT solution. The messaging framework should have certain attributes in a connected solution:

Lightweight – IoT solutions have a myriad of digital entities connected in a mesh. To reduce the load on traffic, the packets should be lightweight. Embedded devices with limited capacity can only afford messages with a small code footprint for best performance.
Local and Global Communication – Deploying a WSN for communication in the local environment reduces long-distance communication with the cloud, thereby reducing latency and network traffic.
QoS (Quality of Service) – QoS is a technique to handle network requirements and manage resources. An IoT solution encompasses a wide range of communication types: guaranteed device-to-device calls, device to message broker, cloud to user devices and vice versa. QoS tailors the network requirements by managing traffic delay, jitter, packet loss and bandwidth. This ensures that the network is used optimally.
Security – The wide range of communications makes the system vulnerable to attacks. The criticality of functionality and the sensitivity of data shared among entities demand best security practices to mitigate risks.

6. Security:
The security considerations for an IoT solution comprise secure communication, robust embedded platforms, and insulation of environments. There is still a long way to go in building a secure infrastructure for IoT. Some principles at the core of building a security framework are:
Isolating environments – In order to mitigate external attacks, we have to insulate the local environment from the global ecosystem. Connecting the edge devices using a secure WSN is essential for intra-device communication. The gateway acts as the only access point to the external internet, and we deploy a firewall at the gateway for filtering messages.
PKI (Public Key Infrastructure) – Setting up a PKI is essential for long-range communication between entities like the gateway, cloud, client devices and backend servers. PKI ensures secure electronic communication by managing public key encryption and by distributing, storing and creating identity certificates, also known as digital certificates or public key certificates.
Virtualization – In a smart city or a huge production plant, embedded devices are located remotely, e.g. on highways or in tunnels. In case of a system crash, rebooting remotely may not be possible and repairing the device physically is a cumbersome task. Virtualization allows hosting a number of VMs (Virtual Machines) on the same SoC (System on Chip), completely sandboxed from each other, thereby ensuring that mission-critical services continue to operate even if one particular component fails. For example, if the JVM crashes, it has no effect on a critical Linux virtual machine running on the same chip.

Smart Manufacturing

Use Case – IoT in Manufacturing:
IoT is evolving into a disruptive force in all sectors, including automotive, manufacturing and healthcare. In this blog, we will understand the current problem statements of the manufacturing industry and how IoT has potential answers to those problems.

    Problem Statement: To reduce operational cost and maximize operational efficiency, the manufacturing industry has identified these improvement areas:
    Gain end-to-end visibility across the entire production process.
    Connect production to the core business.
    Build responsive manufacturing to meet customer needs.
    Real-time asset health monitoring.
    Predictive maintenance.
    Logistics and supply chain.
    Solution Overview: To apply IoT to the above problem statement, I have used my design principles to craft a solution for smart manufacturing:
    Connect production units with operations and business systems using embedded devices and software applications.
    Monitor in real time using sensors for control and optimization.
    Use message brokers to send the right information to the right person at the right time.
    Apply machine learning algorithms to detect patterns and anomalies in the production process.
    Predict maintenance needs and flag machines before downtime or an actual error.
    Integrate predictive information as a continuous data feed into the factory's existing systems and operational processes using REST APIs (producers and consumers).
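    The anomaly-detection step listed above can be illustrated with a simple statistical rule. This is a sketch under assumptions, not the author's actual implementation: it flags sensor readings that deviate from the mean by more than a chosen number of standard deviations, and the vibration values and threshold are made up for the example.

    ```python
    # Illustrative z-score anomaly detection for a stream of sensor readings.
    # Real predictive-maintenance systems would use trained ML models; the
    # readings and threshold here are hypothetical.
    import statistics

    def find_anomalies(readings, threshold=3.0):
        """Return indices of readings more than `threshold` population
        standard deviations away from the mean."""
        mean = statistics.mean(readings)
        stdev = statistics.pstdev(readings)
        if stdev == 0:
            return []  # all readings identical: nothing to flag
        return [i for i, r in enumerate(readings)
                if abs(r - mean) / stdev > threshold]

    # Vibration readings from a machine; the spike suggests a fault worth
    # flagging before it turns into downtime.
    vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 3.9, 0.51]
    print(find_anomalies(vibration, threshold=2.0))  # → [6]
    ```

    A flagged index would then be published through the message broker or REST feed described above so the right person is alerted in time.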

    This is one application of IoT that showcases the power of cloud and edge computing. With distributed computational tasks and tailor-made network configurations, IoT can provide disruptive and cost-effective solutions in diverse sectors. The weapons of a connected ecosystem, real-time data and decision making based on predictive analysis, empower industries to operate consciously and innovate effectively.

    The Oblivious “Knowledge Worker”

    We hardly ever wonder how we, the hunter-gatherers living in temporary shelters, evolved to build the mega cities of the modern world. The physical infrastructure supporting modern life took years of construction and maintenance. Workers have toiled for ages to build a sustainable infrastructure to support and enhance the human experience.

    In the last few decades, the physical infrastructure has proved insufficient to fulfill the growing needs of human society, paving the way for the development of a digital ecosystem to boost, and in some cases replace, the existing infrastructure.

    Smart cities are built on the efforts of a new-age digital workforce. As per KPMG, the current knowledge worker economy is estimated to employ 240 million people at USD 9 trillion, or 27% of the global employment cost. It is estimated that digital technologies will automate almost half of currently performed activities by 2025, with a productivity of USD 5.5 to 6.4 trillion in equivalent labor.

    The oblivious knowledge worker, with his obscure vision, is unable to fathom the rapidly changing dynamics of the digital ecosystem.

    Division of Digital Labor:

    The Industrial Revolution exploded the rate of development of physical infrastructure. Competing human societies diversified their skill sets to meet the ever-growing demand. The largest impact of the Industrial Revolution was an increased division of labor, which brought specialization, which in turn enhanced productivity.

    Similarly, the Digital Revolution has triggered a division of digital labor into groups with specific skills and knowledge of specific tools. The division of digital labor has increased production, improved quality, and reduced time to market for software products. A market economy promotes efficiency, and division of labor facilitates efficiency. Various businesses across the spectrum are migrating to digital to remain competitive. In this intense race, specialized digital labor gives them the horsepower they need to stay ahead.


    The oblivious knowledge worker has to be vigilant about the new roles mushrooming in the digital space. He has to compartmentalize responsibilities, develop skills, and learn tools to fit into a new role. A worker in a specific role does not operate in isolation: he complements workers in other roles, at times diluting the boundaries between roles to achieve a common goal. Thus, the worker with a holistic perspective of the story has a better understanding of his role in the context of the big picture.

    Karl Marx, in his famous work Economic and Philosophic Manuscripts of 1844, argued otherwise. He said, “the division of labor renders him ever more one-sided and dependent, bringing with it the competition not only of men but also of machines. Since the worker has sunk to the level of a machine, he can be confronted by the machine as a competitor”. His apprehension proves rational in times of robotic automation, when mundane tasks and narrow skills are in danger of extinction. The oblivious worker should progressively add new skills to his arsenal to remain relevant in times of automation.

    Division of Intent:
    There is another facet of the division of digital labor, introduced by the increased layering of software applications and solutions. Each layer is built by workers with a different intent, converting innovative ideas into quality products running on a wide portfolio of hardware. The new digital age has set off the mutation of mechanical and electronic machines into software-embedded utilities.

    The user’s interaction with apps is no longer limited to laptops and mobile devices; it has proliferated to watches, home appliances, and cars, to name a few. The time spent interacting with software is distributed across the user’s hardware portfolio. The divided digital labor working at different layers helps realize pervasive computing, where an idea is transformed into ubiquitous software, available to the user across different platforms. The oblivious worker needs to understand the idiosyncrasies of the workers deployed at each layer.

    Platform workers: A platform is an environment in which code executes. It provides abstractions to the application developer at the levels of hardware, the OS (Operating System), and runtime libraries. A platform worker is closely acquainted with hardware components and operating system internals. This class of software worker needs a holistic view of the system to equip apps to squeeze out the last drop of computational power. Their agenda is to make platforms ubiquitous, empowering developers to run the same app across the hardware spectrum.

    Framework workers: A software framework provides a standard methodology to build and deploy apps. Every framework adheres to certain principles, and so it imparts a peculiar character to the apps developed with it.
    The workers developing framework elements such as support programs, compilers, code libraries, and application programming interfaces (APIs) focus on creating reusable components that fit smoothly into custom code and are easy for first-time application workers to learn. Framework workers craft their components with best coding practices and design principles. Application workers using the framework are thus forced to adhere to coding standards by design, not by choice.

    Application workers: Applications, or apps, are the perceptible elements of functionality leveraged by the user. They are written for different platforms such as Android, Windows, iOS, and the World Wide Web. Software workers at this layer work to rapidly transform business requirements into workable code. The idiosyncrasy of application workers is agility. Their agility gives the business layer the flexibility to change requirements, and the architects the flexibility to change technology, on the move. They are empowered by tools, frameworks, and development kits to deliver quality, maintainable, and extensible code.

    The Robotic Renaissance:
    The Industrial Revolution saw the rise of machines like the “Flying Shuttle”, the “Spinning Jenny”, and the “Steam Engine” to reduce labor, improve efficiency, and increase production. Craftsmen were driven out of business and were later forced to work in the same factories. Products were manufactured faster and at a much larger scale.

    The oblivious knowledge worker is experiencing similar competition from an upsurge in automation. Robotic process automation has embarked on its journey at the presentation layer, where it aims to automate redundant, mundane, pattern-based tasks. Bots are learning to replicate low-skill human tasks such as customer service, data entry, and the first layer of troubleshooting. Products like Blue Prism, UIPath, and AutomationAnywhere have initiated the first wave of automation of rule-based tasks, reducing the value proposition of business process outsourcing companies.

    The proliferation of automation is not limited to low-skill jobs; the fusion of cognitive intelligence and process automation challenges even highly skilled knowledge workers. Learning bots trained in cognitive skills will be able to identify patterns in processes and automate them without human intervention, a technique called “process mining”. The outsourcing and offshore business model will be under pressure from these automation forces.

    The current factory system in the digital industry, following the “assembly line” approach where each worker concentrates on a single repetitive and monotonous task, will crack in the winds of automation. Knowledge workers will reskill to perform more innovative tasks. The modern startup system, in which a small group of knowledge workers works on innovative solutions or builds core competency, will challenge the monolithic factory systems. This will usher in an era of mergers and acquisitions, synthesizing small groups with diverse core competencies.

    The enlightened knowledge worker will not be oblivious anymore. He will strive to discern his role, build advanced competencies, narrow down potent problems and build innovative solutions. The enlightened knowledge workers will build the smart world of the present knowledge age.