Black box algorithms – the darker side of the digital revolution

Whether you get a job or a mortgage, who you date or where you eat – algorithms increasingly determine the big and small decisions in our lives. We may not be aware of it, but behind the scenes many companies, and increasingly governments too, are using or planning to use algorithms to automate bureaucratic processes and business decisions, because algorithms are faster and more efficient than people. But do they always make better decisions? Can algorithms be misused for monetary gain at the expense of others? With these questions in mind, this article explores the darker side of black box algorithms that we as a society must address as our lives become increasingly intertwined with modern digital technologies.


Can machine learning transform trading strategies in financial institutions?

Over the past two decades, trading in financial instruments has seen a remarkable evolution – from open outcry trading floors to on-screen trade booking and execution, all the way to algorithmic and high-frequency trading. The rise of machine learning and artificial intelligence (AI) looks like the natural next step in that evolution. With that in mind, this article explores some practical examples of where machine learning is already being used in financial institutions today, and the challenges in building intelligent autonomous trading systems.
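Machine learning in trading is easier to picture with a toy example. The sketch below fits a from-scratch logistic regression to a single momentum feature on a synthetic price series. Every number, feature name, and parameter here is invented for illustration – it is not one of the article's practical examples, and nothing about it resembles a production trading system:

```python
# Toy sketch of a "learned" trading signal: logistic regression on one
# momentum feature, implemented from scratch. All data and parameters
# are synthetic and illustrative -- not a real trading strategy.
import math
import random

def momentum_feature(prices, lookback=3):
    """Return (feature, label) pairs: trailing return vs. next-step direction."""
    samples = []
    for t in range(lookback, len(prices) - 1):
        past_return = (prices[t] - prices[t - lookback]) / prices[t - lookback]
        next_up = 1 if prices[t + 1] > prices[t] else 0
        samples.append((past_return, next_up))
    return samples

def train_logistic(samples, lr=0.1, epochs=200):
    """Fit weight w and bias b by per-sample gradient descent on logistic loss."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

if __name__ == "__main__":
    random.seed(42)
    # Synthetic trending series: returns are autocorrelated, so the
    # momentum feature carries a weak predictive signal.
    prices, trend = [100.0], 0.3
    for _ in range(300):
        trend = 0.95 * trend + random.gauss(0, 0.2)
        prices.append(prices[-1] * (1 + trend / 100))
    data = momentum_feature(prices)
    w, b = train_logistic(data)
    hits = sum(1 for x, y in data
               if (1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5) == (y == 1))
    print(f"in-sample accuracy: {hits / len(data):.2f}")
```

Real systems differ from this toy in every dimension – hundreds of engineered features, rigorous out-of-sample validation, and strict risk controls – but the core loop of "extract features, fit a model, generate a signal" is the same.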

Introducing RegTech and LawTech

Technology has had a transformative impact on our everyday lives, and continues to reshape the way businesses operate and interact with their customers. Yet the legal and regulatory industries have not fundamentally changed – at least not yet. The rise of LawTech and RegTech promises not only to transform these centuries-old professions, but also to enable us to completely rethink and redesign the very basic concepts of trust, contracts, regulatory compliance, and access to justice. With that in mind, this introductory article explores how RegTech and LawTech are increasingly becoming central to overall digital transformation and compliance strategies.


Developer kit for AI hobbyists, enthusiasts and students

The power of modern AI is now available for makers, learners, and embedded developers everywhere, for just $99. At the GPU Technology Conference this week, NVIDIA announced the Jetson Nano™ Developer Kit – a small, powerful computer that lets you run multiple neural networks in parallel for applications like image classification, object detection, segmentation, and speech processing. If you would like to know more about Jetson Nano, I have a couple of videos for you.

Unlocking the business value in connected data: Neo4j

“Faster computer processing isn’t the only answer to optimising artificial intelligence (AI). Having the right data that maps relationships to feed AI is paramount, if you are looking to exploit trends such as personalisation” – Emil Eifrem, CEO of Neo4j, a leading open source graph database.

A couple of months back, I wrote an article on “Automating audits using graph technology”. This article explores what a graph database is and why it is relevant in today’s data-driven organisations. And to translate that understanding into something more concrete, we will also look under the hood of Neo4j and the amazing things you can accomplish with this open source graph database.
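Before diving into Neo4j itself, it helps to see what the property-graph model actually is. The minimal pure-Python sketch below mimics the nodes-plus-typed-relationships model that graph databases like Neo4j provide natively – the classes are hypothetical teaching aids, not Neo4j's API or its Cypher query language:

```python
# Minimal sketch of the property-graph model: nodes and typed, directed
# relationships, both carrying properties. Illustrative only -- this is
# not Neo4j's actual API.

class Node:
    def __init__(self, label, **props):
        self.label, self.props = label, props
        self.out = []  # outgoing relationships

class Relationship:
    def __init__(self, start, rel_type, end, **props):
        self.start, self.type, self.end, self.props = start, rel_type, end, props
        start.out.append(self)

def neighbours(node, rel_type):
    """Follow relationships of one type -- the core graph operation."""
    return [r.end for r in node.out if r.type == rel_type]

# Build a tiny social graph: (alice)-[:KNOWS]->(bob)-[:KNOWS]->(carol)
alice = Node("Person", name="Alice")
bob = Node("Person", name="Bob")
carol = Node("Person", name="Carol")
Relationship(alice, "KNOWS", bob, since=2015)
Relationship(bob, "KNOWS", carol, since=2019)

# "Friends of friends" is just a two-hop traversal -- no join tables.
fof = [n for friend in neighbours(alice, "KNOWS")
       for n in neighbours(friend, "KNOWS")]
print([n.props["name"] for n in fof])  # -> ['Carol']
```

In a graph database the traversal above is a one-line pattern match, and relationships are stored as direct pointers rather than recomputed through joins – which is exactly why connected-data queries like recommendations or fraud rings stay fast as the data grows.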


Building a data lake using open source technology

This is a sequel to yesterday’s article on “What is a data lake and why do you need one?”. In this article, we will explore some of the key open source technologies that are enabling the design and implementation of highly cost-effective, distributed, resilient and scalable data lake architectures in many data-driven organisations today. More specifically, this article will cover the following technologies: Apache Kafka, Apache Spark, and Apache Hadoop (HDFS).
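A concrete picture of how raw data lands in a lake may help. The sketch below writes newline-delimited JSON events into date-partitioned directories on the local filesystem – the same `dt=YYYY-MM-DD` partitioning convention commonly used on HDFS – with the event fields and paths invented for illustration. In a real pipeline, Kafka would be delivering the events and Spark reading the partitions:

```python
# Illustrative sketch of a data lake "raw zone": events land as
# newline-delimited JSON, partitioned by date. The directory convention
# mirrors what HDFS-based lakes typically use; fields are made up.
import json
import os
import tempfile

def land_event(root, event):
    """Append one event to the partition directory for its date."""
    partition = os.path.join(root, "events", f"dt={event['date']}")
    os.makedirs(partition, exist_ok=True)
    with open(os.path.join(partition, "part-0000.json"), "a") as f:
        f.write(json.dumps(event) + "\n")

def read_partition(root, date):
    """Read every raw event stored for one date partition."""
    path = os.path.join(root, "events", f"dt={date}", "part-0000.json")
    with open(path) as f:
        return [json.loads(line) for line in f]

if __name__ == "__main__":
    lake = tempfile.mkdtemp()
    land_event(lake, {"date": "2019-04-01", "user": "u1", "action": "click"})
    land_event(lake, {"date": "2019-04-01", "user": "u2", "action": "view"})
    land_event(lake, {"date": "2019-04-02", "user": "u1", "action": "buy"})
    print(len(read_partition(lake, "2019-04-01")))  # -> 2
```

Partitioning by date means a query for one day touches only that day's files – the same pruning trick that makes Spark-on-HDFS queries scale, here shrunk to a few lines.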

What is a data lake and why do you need one?

Data lakes are emerging as the most common architecture built in data-driven organisations today. A data lake enables an organisation to store unstructured, semi-structured, or fully structured raw data, and to process it for different types of analytics – from dashboards and visualisations to big data processing, real-time analytics, and machine learning. Well-designed data lakes ensure that organisations get the most business value from their data assets. This article explores what a data lake is and the business case for building one.
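The defining idea – storing raw data first and imposing structure only when an analysis needs it – is often called schema-on-read, and a tiny sketch shows the principle. The records and fields below are made up for illustration:

```python
# Sketch of "schema-on-read": raw records of mixed shape are stored
# as-is, and structure is imposed only at analysis time. All records
# and fields are invented examples.
import json

# Raw zone: records arrive in whatever shape the source produced.
raw = [
    json.dumps({"user": "u1", "amount": 12.5}),           # structured
    json.dumps({"user": "u2", "meta": {"amount": 3.0}}),  # semi-structured
    "free-text log line: user u3 spent 7.5",              # unstructured
]

def extract_amount(record):
    """Impose a schema at read time; each source shape needs its own parser."""
    try:
        doc = json.loads(record)
        return doc.get("amount") or doc.get("meta", {}).get("amount")
    except json.JSONDecodeError:
        # Crude parse of the free-text form -- a real lake would use
        # proper log parsing here.
        return float(record.rsplit(" ", 1)[-1])

total = sum(extract_amount(r) for r in raw)
print(total)  # -> 23.0
```

Contrast this with a traditional warehouse, where the three sources would each need an agreed schema before a single record could be loaded – schema-on-read defers that cost until a question is actually asked of the data.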


5G: Driving the automation of everything

2019 is the year 5G is slated to become a reality. 5G, or fifth-generation mobile internet, is not your traditional network. Rather, it is a network optimised for connected and intelligent machines, with the capacity to support everything from devices with gigabit-per-second data needs to those requiring multi-year battery life. Software, connectivity and digitalisation are reshaping both industries and society at an ever-increasing pace, and 5G is going to be a strong catalyst. With that in mind, this article explores what a 5G-powered world of automation might look like.

The ethical dilemma of using artificial intelligence and autonomous technology

We live in an age of rapid technological advances where artificial intelligence is a reality, not science fiction. Every day we rely on algorithms to communicate, do our banking online, book a holiday – even introduce us to potential partners. Driverless cars and robots may be the headline makers, but artificial intelligence is being used for everything from diagnosing illnesses to helping police predict crime hot spots. As machines become more advanced, how does society keep pace when deciding the ethics and regulations governing technology? To address this question, this article explores the ethical dilemmas surrounding the use of artificial intelligence and autonomous technology.