There are many challenges associated with performing algorithmic trades in the real world. Despite these challenges, there are a dozen well-known names in the algorithmic trading world, and more players try their hand at this business every year.

The nature of this industry can be characterized as heavily reliant on technology: whoever develops and properly executes a novel technological advancement instantly gains an edge in the competition for faster and more lucrative trades. Accordingly, at the core of these companies’ efforts lies research and development into the cutting edge of financial technology. In this article we will look at the most promising innovations in algorithmic trading and try to predict what the trading world could look like in the near future.

For convenience, we will group the innovations by the goal their developers are trying to achieve. These are:

  • Improvements in speed. The faster one executes an order, the more options one has to choose from. This category will touch upon developments in networking and hardware, low-level programming languages, and the colocation principle.

  • Research into alternative sources of data. There is a lot of crucial financial information one can extract from the masses of freely available data. In this category we will describe the extraction of social network and text data from the Internet, opinion mining and sentiment analysis, information spreading, image data analysis, and some more exotic sources of information.

  • Application of neural networks and machine learning. These technologies have recently caught the attention of algorithmic traders. We will look into financial machine learning systems and recurrent neural networks.

 

Improvements in speed

Improvements in speed arguably comprise the core interest of all algorithmic traders in the industry. Since the majority of algorithmic trades these days are high-frequency trades, the speed of order delivery, as well as order throughput, is a crucial factor of success.

 

Overall improvements in networking hardware and software

Go West, a joint venture to build the fastest possible communication line between the Chicago and Tokyo trading hubs, was launched in 2016. The line was designed with an assortment of communication technologies, including microwave radio towers, high-speed fiber-optic lines and reinforced undersea cables. The initiative comprises IMC, Jump Trading, KCG Holdings, Optiver, Tower Research, DRW’s Vigilant division, Virtu Financial and XR Trading.

Such technological diversity reflects the tradeoffs among the different solutions. Microwave communication has been actively used and developed for more than 60 years and is favored for its low transmission latency. In real-life conditions, microwave links transmit information with roughly two to six times lower latency than fiber-optic lines. This is largely because radio waves propagate through air faster than light does through glass. In addition, radio towers can typically form the shortest possible communication line, while physical cables have to adjust to the terrain.
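
To see where the gap comes from, here is a back-of-the-envelope latency calculation in Python. The route lengths and velocity factors below are illustrative assumptions, not figures from the Go West project:

```python
# Rough one-way latency comparison between a microwave link and a fiber link.
# All figures are illustrative assumptions for the sake of the example.
C_KM_S = 299_792                 # speed of light in vacuum, km/s
V_MICROWAVE = C_KM_S * 0.99      # radio waves in air travel at almost c
V_FIBER = C_KM_S / 1.47          # light in glass is slowed by the refractive index

microwave_km = 1_000             # hypothetical line-of-sight route
fiber_km = microwave_km * 1.3    # cables detour around terrain

microwave_ms = microwave_km / V_MICROWAVE * 1_000
fiber_ms = fiber_km / V_FIBER * 1_000
print(f"microwave: {microwave_ms:.1f} ms, fiber: {fiber_ms:.1f} ms")
# microwave: 3.4 ms, fiber: 6.4 ms -- roughly a 2x difference on this route
```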

Fiber-optic cables, however, have plenty of practical advantages. First, they typically offer higher throughput (the amount of data they can deliver in a given time frame) than their microwave counterparts. Second, they are far more reliable and require less maintenance: microwave towers depend on a clear line of sight and suffer severe signal loss when weather phenomena such as rain or clouds appear between them.

In the end, the team behind the project attempted to balance cost vs. benefit as they connected the two hubs, a problem complicated by the terrain challenges they faced. Keeping in mind the sheer cost and effort required to bring such a project to life, one can imagine the importance of latency in algorithmic trading.

 

FPGA chips

FPGA chips, or Field-Programmable Gate Arrays, are integrated circuits that can be configured by a customer after manufacturing (hence the term, “field-programmable”). These sophisticated hardware devices are quite often used in high-frequency trading applications.

No hardware or software engineer would dispute that a task implemented in hardware runs faster than the same task implemented in a software abstraction. At the same time, the software layer is the medium that allows general-purpose hardware to perform a wide variety of tasks. FPGA chips turn the tables a bit in that they allow customers to design and program their own hardware logic into the chip’s so-called logic fabric without having to go through the actual hardware manufacturing process.

The process of programming an FPGA chip is sophisticated. First, the user describes the structure and functional behavior of their target design in a Hardware Description Language (HDL), a specialized computer language that allows hardware models to be simulated on a computer. Then, synthesis software generates a netlist: a description of the design’s individual hardware components and their connectivity. Next, dedicated software uses the netlist to produce a map, a virtual representation of the design’s hardware architecture adapted to fit the chip’s specifications. This map can then be simulated and validated using various analysis techniques. Finally, once the map is validated, a binary file is generated and loaded onto the FPGA chip.

The advantage of FPGA technology comes from its ability to parallelize many simple custom tasks rather than provide a universal computation solution like the general-purpose CPUs found in most consumer electronics. One typical FPGA application in HFT is the so-called market feed handler, a device that parses and processes multiple data feeds with astonishing speed. One can picture an FPGA as a factory with lots of workers each taught to perform a single task, while a CPU is a firm with just a handful of broadly trained experts. Hence, FPGA systems are best used in conjunction with other types of hardware and software, so that each task is tackled with the best-suited technology.
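
To make the feed-handler idea concrete, here is a behavioral sketch in Python of what such a device does: fields are pulled from fixed byte offsets with no parsing logic or branching. On an FPGA the same extraction is wired directly into the logic fabric and runs for many messages in parallel; the 17-byte wire format below is hypothetical, not a real exchange protocol:

```python
import struct

# Hypothetical fixed-width market data message:
# 8-byte symbol, 4-byte price in ticks, 4-byte size, 1-byte side.
MESSAGE = struct.Struct(">8sIIb")

def handle(raw: bytes):
    # Fixed-offset field extraction: the kind of rigid, repetitive work
    # that maps naturally onto FPGA logic.
    symbol, price_ticks, size, side = MESSAGE.unpack(raw)
    return symbol.rstrip(b" "), price_ticks, size, side

msg = struct.pack(">8sIIb", b"BTCUSD  ", 6_150_000, 3, 1)
print(handle(msg))   # (b'BTCUSD', 6150000, 3, 1)
```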

One may object that Application-Specific Integrated Circuits, or ASICs, can be custom-designed like their FPGA counterparts and produced much more cheaply at scale. ASICs are integrated circuits designed to perform a single type of task under a substantial workload. However, ASICs cannot be modified after production, and HFT calls for constant methodology reevaluation and an iterative approach: traders want to be able to retool their devices with the fastest working algorithms to stay on top of the game.

 

Colocation

Colocation is probably the simplest and most ingenious innovation we look at in this article. Why wrestle with network latency between servers when you could simply bring the servers together?

Colocation data centers lease server floor space with all the accompanying services: power, cooling and connectivity, as well as physical and disaster security. Businesses of all sizes can take advantage of these services, renting anything from a single rack to an isolated room with a custom access policy. Various connectivity tiers are also available to satisfy renters’ differing demands. Many businesses find such services appealing: thanks to economies of scale, they can deploy their server infrastructure in an advanced environment for a fraction of the cost of owning and maintaining their own server space.

However, there is one more benefit to choosing colocation: renters can establish fast, direct connections with one another. Thanks to that capability, many latency-sensitive industries have switched to colocation services, and algorithmic traders are no exception. They cluster their trading infrastructure around that of financial data providers and trading platforms, taking full advantage of ultra-fast data connections within, essentially, one building. In this way they can fetch data and execute orders as fast as possible.

Recently, the crypto exchange HitBTC opened its doors to algorithmic traders in the LD4 data center in London, which gives some insight into how quickly such innovations carry over into the crypto world.

 

Research into alternative sources of data

Accounting for the human factor has been an integral part of making trading decisions for as long as trading has existed. The core idea behind this practice is that successful trading decisions typically rest on facts of nature or human behavior that reveal something useful to the trader.

In this section we will take a look at how people analyze these natural processes and human opinion to perform successful trades.

 

Social networks and text data

It’s no wonder that social media is at the forefront of modern information exchange. Social media users are probably the largest and most involved market audience in the world: out of 4.4 billion Internet users, 3.5 billion actively use social media, an incredible 79% of the total. The audience is also deeply engaged, with the average user holding 7.6 social media accounts and spending 142 minutes a day on social media. All this makes it obvious why social media can be used to analyze sentiment and social attitudes towards specific trading entities and events. With the number of traders naturally expanding and opinions increasingly vocalized on the Internet, it makes perfect sense as a group opinion resource.

Sentiment analysis is also referred to as opinion mining. This data processing technique has been known for quite some time, but it came into the spotlight with the advent of two modern-day phenomena, social networks and machine learning:

  • Social networks have enabled and popularized mass opinion vocalization, resulting in enormous datasets of publicly available, opinionated, text-based information: for instance, every 60 seconds Facebook users post 510,000 comments and update 293,000 statuses.

    Opinion mining technologies have followed this trend, permitting marketing and political analysts to extract valuable data on public opinion;

  • Machine learning algorithms, particularly neural networks, have benefited greatly from social networks: developers can use these enormous amounts of publicly available text data to train and test their models.

    In addition, machine learning algorithms allow developers to decode opinion from data without formalizing explicit rules for inferring meaning from content, as the sketch after this list illustrates.
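
Here is a minimal sketch of such rule-free opinion mining with scikit-learn; the tiny training set is invented purely for illustration:

```python
# Learned (rather than rule-based) sentiment classification.
# The training posts and labels are toy data, invented for this example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "BTC looks strong, great momentum today",
    "loving this rally, buying more",
    "terrible quarter, selling everything",
    "this project looks like a scam, stay away",
]
labels = [1, 1, 0, 0]   # 1 = positive opinion, 0 = negative

# No hand-written rules: the model learns word weights from the examples.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

print(model.predict(["strong rally, momentum building"]))   # expected: [1]
```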

 

Information spreading

Social media enriches opinion data with two additional qualities: locality and impact. When analyzing opinion data, it is crucial to understand not only what people say, but also which opinions attract attention and how they spread, both geographically and hierarchically. As is the case with all kinds of trading, getting this information first means getting an edge over your competition.

Social media allows data miners to easily track what area and language the information originates from and see local and global trends. Tagging, which is used throughout social media, does some of the work for the sentiment analysts, pre-labeling pieces of information and sorting them into categories.

Coupled with impact, that is, the amount of attention a piece of information has collected in likes and views, this makes social media a very powerful resource for analyzing sentiment and predicting its direction.
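
As a toy illustration, here is one way impact and locality weighting could be combined in Python; the post records and the weighting scheme are assumptions made up for this sketch:

```python
from collections import defaultdict

# Toy post records: region, sentiment score in [-1, 1], and engagement.
posts = [
    {"region": "US", "sentiment": +0.8, "likes": 1200, "views": 40_000},
    {"region": "US", "sentiment": -0.3, "likes": 15,   "views": 900},
    {"region": "JP", "sentiment": +0.2, "likes": 300,  "views": 12_000},
]

def impact(post):
    # Crude impact proxy: a like signals more attention than a passive view.
    return post["likes"] * 10 + post["views"]

totals = defaultdict(float)
weights = defaultdict(float)
for p in posts:
    w = impact(p)
    totals[p["region"]] += p["sentiment"] * w
    weights[p["region"]] += w

for region, total in totals.items():
    print(region, round(total / weights[region], 3))   # impact-weighted sentiment
```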

 

Satellite imagery

One cannot describe alternative sources of data without mentioning the story of RS Metrics, a company founded by the Diamond brothers back in 2009. Coming from backgrounds in finance and satellite data processing, they decided to combine their expertise to backtest a theory: what if the number of cars in Walmart parking lots correlates with the company’s revenue? After three months of testing the proposal on historical satellite images of Walmart, Home Depot, Lowe’s and McDonald’s, they had a match: the data aligned. They went on to found RS Metrics (with RS standing for “remote sensing,” a term from the scientific world) and got their big break in 2010, when Neil Currie of the bank UBS was able to make an accurate prediction on Walmart’s stock price in his quarterly earnings preview.

These days, satellite data processing has turned into a competitive industry, with companies like Orbital Insight, Maxar (previously DigitalGlobe), ImageSat International, UrtheCast, Descartes Labs and Planet Labs competing to provide the most accurate and actionable insights from satellite images. Such data is used across a variety of industries to assess natural resource extraction, agriculture, large-scale production, the influence of natural phenomena, and human traffic. With such powerful tools, a client’s imagination is practically the only limit on what these companies can do.

There have been concerns about the legality of using satellite data for trading insights, particularly the lack of equal access to such data, as satellite images carry a hefty price tag. However, using such data does not break any rules: it is neither insider information nor an illegally gained asset. With the satellite data industry growing so fast, we can expect this resource to become cheaper quickly.

 

Application of machine learning

Even the least technology-savvy people have heard about machine learning (ML). In the last 5 to 10 years its applications have multiplied: it is used for everything from recognizing device owners by their faces to giving accurate healthcare recommendations. It is crucial to understand that machine learning is a broad area of scientific study of algorithms that perform tasks without explicit instructions, while so-called neural networks, a technology loosely inspired by biological neural networks, are a subset of machine learning.

The benefit of this technology comes from not needing to develop or formalize a logic model for the application. To make a functioning neural network application, a developer just needs a sufficiently large set of sample data with which to “train” the model, marking out correct and incorrect conclusions made by the software. The software uses this feedback to make ever-more-accurate conclusions until the necessary level of accuracy is reached. The process is fairly similar to artificial selection: only the desired traits are carried on to subsequent generations of the product.
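
Here is a minimal sketch of that train-until-accurate loop: a single artificial neuron learning a toy rule by gradient descent. The data and learning rate are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))               # 200 samples, 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # toy "correct answers"

w, b, lr = np.zeros(2), 0.0, 0.5
for step in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))      # current predictions
    err = p - y                              # how wrong each prediction is
    w -= lr * X.T @ err / len(X)             # nudge weights toward the answers
    b -= lr * err.mean()

print(f"training accuracy: {((p > 0.5) == y).mean():.0%}")
```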

With such a simplified method of developing working algorithms for complex phenomena, it’s no wonder machine learning has made its way into the trading world. To get started, we can return to a familiar story, counting cars in parking lots, and see where neural networks fit in. When Tom Diamond was testing his assumptions about the number of cars in parking lots, he started off with an app that counted the clicks he made over the cars in a photo. But imagine you needed to count the cars in the parking lots of several distribution chains across tens or hundreds of cities. This is where neural network algorithms come in handy: a model can be trained to recognize empty and occupied parking spaces, and modifying it to work in different lighting or weather conditions makes it even more versatile. The result is software that can tremendously speed up data inference from satellite photos.
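
A sketch of how such a counter could be wired together, with a placeholder standing in for the trained classifier (a real system might use a small convolutional network trained on labeled parking-lot patches):

```python
import numpy as np

def is_occupied(patch: np.ndarray) -> bool:
    # Placeholder for a trained classifier; a real model would be learned
    # from labeled satellite image patches, not a brightness threshold.
    return patch.mean() > 0.5

image = np.random.rand(1024, 1024)   # stand-in for one satellite photo
PATCH = 32                           # assume one parking space per 32x32 patch

count = 0
for row in range(0, image.shape[0], PATCH):
    for col in range(0, image.shape[1], PATCH):
        if is_occupied(image[row:row + PATCH, col:col + PATCH]):
            count += 1

print(f"occupied spaces detected: {count}")
```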

But the major point of ML’s application to trading is the trading algorithms themselves. Such algorithms typically produce the following information (a minimal sketch follows the list):

  • The amounts traded

  • The quantity of trades

  • The point of submitting a trade

  • The point of exiting a trade
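
Here is a minimal sketch of an algorithm emitting those four outputs; the signal threshold and sizing rule are invented for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TradePlan:
    size: float          # the amount traded
    n_orders: int        # the quantity of trades (order sliced into children)
    entry_price: float   # the point of submitting the trade
    exit_price: float    # the point of exiting the trade

def plan_trade(price: float, predicted_return: float) -> Optional[TradePlan]:
    if abs(predicted_return) < 0.002:             # signal too weak: stay out
        return None
    size = min(abs(predicted_return) * 100, 1.0)  # stronger signal, larger size
    return TradePlan(
        size=size,
        n_orders=4,                               # slice into 4 child orders
        entry_price=price,
        exit_price=price * (1 + predicted_return),
    )

print(plan_trade(price=61_500.0, predicted_return=0.004))
```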

But as with other complex systems, the devil is in the details. Hedge funds invest significant resources into researching and developing such algorithms and keep their findings closely guarded from the outside world. Their algorithms are some of the most expensive pieces of software in the world.

Recently, there has been increased interest in Long Short-Term Memory (LSTM) networks and Gated Recurrent Unit (GRU) networks. These are varieties of recurrent neural networks that can retain context across a sequence of inputs, which makes them a natural fit for time-series data such as prices. What can certainly be said about these algorithms in particular, and trading algorithms as a whole, is that no single algorithm is likely to provide noteworthy results for a prolonged period of time: the nature of trading processes is an ever-shifting thing.
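
For a flavor of what this looks like in practice, here is a minimal LSTM sketch using Keras; the synthetic price series, window length and layer sizes are all illustrative assumptions:

```python
import numpy as np
import tensorflow as tf

WINDOW = 30
prices = np.cumsum(np.random.randn(1000)).astype("float32")  # synthetic series

# Sliding windows of the last 30 prices, each labeled with the next price.
X = np.stack([prices[i:i + WINDOW] for i in range(len(prices) - WINDOW)])
y = prices[WINDOW:]
X = X[..., None]   # shape (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(WINDOW, 1)),  # keeps recent context
    tf.keras.layers.Dense(1),                           # predicts the next value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print("next-step prediction:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```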

In this article we have reviewed the most recent and significant practices and technologies being researched and successfully applied in the algorithmic trading world. These innovations reward not only their owners and developers; people as a whole stand to benefit, as with time these innovations find their way into our daily lives. With the amount of resources being invested into research in the trading world, trading may soon join the military and the space industry as one of the workhorses pushing technological development forward.

