Transaction Street (TxStreet) Backend

Christian Tucker

Backend Engineer
Blockchain Developer
Frontend Engineer
MongoDB
Node.js
Redis

About Transaction Street (TxStreet)...

Transaction Street was a real-time transaction visualizer that handled ingestion and processing of multiple blockchains to provide live visualizations of the state of pending blockchain transactions. The visualization itself was quite interesting: characters would wait in line at a bus stop, which represented the estimated fill of each pending block before it was confirmed on the network. Based on several calculations we could usually determine, with reasonable accuracy, which transactions would be included in which block and roughly how long they would be waiting in line. Characters could come out of "houses" that represented different DAPPs, so if a transaction utilized Uniswap, for example, your character would come out of Uniswap's house. Characters could also be customized through NFTs: if a transaction was made from a wallet owning a supported NFT, that NFT would be used as the character.
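The block-inclusion estimate described above can be sketched as a simple fee-priority packing pass: sort pending transactions by fee rate, then greedily fill blocks up to a gas budget. This is a minimal illustration, not TxStreet's actual algorithm; the field names and the gas limit constant are assumptions.

```javascript
// Sketch: estimate how many blocks from now each pending tx lands in.
// BLOCK_GAS_LIMIT and the tx shape ({ hash, gasPrice, gasLimit }) are
// illustrative assumptions, not TxStreet's real values.
const BLOCK_GAS_LIMIT = 30_000_000;

function estimateBlockPositions(pendingTxs) {
  // Highest fee rate first -- block producers generally order this way.
  const sorted = [...pendingTxs].sort((a, b) => b.gasPrice - a.gasPrice);

  let blockIndex = 0;
  let gasUsed = 0;
  const assignments = new Map();

  for (const tx of sorted) {
    if (gasUsed + tx.gasLimit > BLOCK_GAS_LIMIT) {
      blockIndex += 1; // this tx spills into the next block
      gasUsed = 0;
    }
    gasUsed += tx.gasLimit;
    assignments.set(tx.hash, blockIndex); // txHash -> blocks-from-now
  }
  return assignments;
}
```

The "line at the bus stop" is then just each transaction's distance from the front of its estimated block.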

(Re)Designing the Backend...

The backend of TxStreet is a bit of an interesting project, because I wasn't given free rein over its design and was required to stick to several constraints set by the original developer. It was his project, after all, and it came equipped with an already-functional backend, but that backend wasn't designed to scale properly.

I was tasked with re-creating the backend in a way that increased the overall scalability and allowed for simplification of adding additional blockchain implementations through abstraction. This meant drastically lowering database load, improving query times, and reducing costs associated with bandwidth.
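The "additional blockchain implementations through abstraction" goal can be sketched as a small adapter interface: each chain implements the same methods, so the ingestion pipeline never changes when a chain is added. The class and method names here are illustrative assumptions, not TxStreet's actual API.

```javascript
// Hypothetical shape of the chain abstraction: one adapter per blockchain,
// all exposing the same interface to the shared ingestion pipeline.
class ChainAdapter {
  constructor(name) {
    this.name = name;
  }
  // Subclasses override these with node-specific logic.
  async fetchPendingTransactions() {
    throw new Error('not implemented');
  }
  normalize(rawTx) {
    throw new Error('not implemented');
  }
}

class EthereumAdapter extends ChainAdapter {
  constructor() {
    super('ethereum');
  }
  async fetchPendingTransactions() {
    // In the real system this would query the node's mempool over JSON-RPC;
    // stubbed here so the sketch stays self-contained.
    return [{ hash: '0xabc', gasPrice: 12n }];
  }
  normalize(rawTx) {
    // Map node-specific fields onto the common schema the pipeline stores.
    return { chain: this.name, hash: rawTx.hash, fee: rawTx.gasPrice };
  }
}
```

Adding a new chain then means writing one adapter rather than touching the pipeline, which is where the simplification comes from.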

Supported Features and Chains...

When I was building out the backend for TxStreet, it was in the process of evolving from a purely real-time visualizer into one that also offered historical transaction data, essentially becoming a form of visualized block explorer. TxStreet was able to process and analyze transactions from multiple blockchains: Bitcoin, Bitcoin Cash, Litecoin, Ethereum, Monero, and Polygon. Because of the scale required to ingest and store pending transactions for all of these chains, database optimization and indexing became a large area of focus.
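As an illustration of the kind of indexing this workload calls for, here is a hedged sketch of index definitions for a pending-transaction collection, in the shape the Node MongoDB driver's `createIndexes` accepts. The field names and TTL value are assumptions, not TxStreet's actual schema.

```javascript
// Illustrative index layout for a per-transaction collection. Queries are
// almost always scoped by chain, so compound indexes lead with `chain`.
const transactionIndexes = [
  // Point lookups by hash within a chain (explorer pages, dedup checks).
  { key: { chain: 1, hash: 1 }, unique: true },
  // Newest-first listings per chain without an in-memory sort.
  { key: { chain: 1, firstSeen: -1 } },
  // TTL index so stale unconfirmed entries expire instead of piling up.
  { key: { lastSeen: 1 }, expireAfterSeconds: 60 * 60 * 24 },
];

// With a live connection this would be applied as:
// await db.collection('transactions').createIndexes(transactionIndexes);
```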

Each chain had its own processors attached for determining which DAPP a transaction was associated with. In Ethereum, for example, we could decode the contract call data of most popular contracts and use that to help visualize the transaction; we would also check for NFTs at this point. In the case of Bitcoin-based chains, we would analyze the OP_RETURN output. There's not much data you can store inside an OP_RETURN, but it was enough for several DAPPs to build on Bitcoin over the years.
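Extracting an OP_RETURN payload looks roughly like the following. This minimal sketch handles only the simple single-pushdata case (opcode `0x6a` followed by a direct push of at most 75 bytes); real scripts can also use OP_PUSHDATA1/2/4, which is skipped here for brevity.

```javascript
// Pull the embedded payload out of an OP_RETURN output script (hex-encoded).
const OP_RETURN = 0x6a;

function decodeOpReturn(scriptHex) {
  const script = Buffer.from(scriptHex, 'hex');
  if (script[0] !== OP_RETURN) return null; // not a data-carrier output
  const pushLen = script[1]; // opcodes 0x01-0x4b push that many bytes directly
  if (pushLen === undefined || pushLen > 0x4b) return null; // skip PUSHDATAn
  return script.subarray(2, 2 + pushLen); // the embedded payload
}

// Example script: 6a 05 68656c6c6f  -> OP_RETURN, push 5 bytes, "hello"
```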

Unique Challenges...

Because of the scale required to process pending transactions from numerous blockchains, we needed a large number of processors running at any given time. This yielded a unique challenge: the actual nodes did not care which processors were already aware of a transaction, which led to every processor being informed about the same transaction, sometimes more than once due to the nature of the nodes. To resolve this (and prevent processing duplicate transactions, as processing was a time-consuming task) we implemented a locking mechanism using Redis that allowed only one processor to work on a transaction at a time. A separate service was responsible for populating a queue of events, and the processors would pick up items from that queue in order of priority, ignoring all locked items. If a locked item was not resolved within a set timeframe, it became unlocked again; this prevented issues where a service hangs or dies mid-task. Since the item would then be older, it would have a higher priority and be picked up by the next available processor.
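The lock-with-expiry pattern described above can be sketched as follows, with an in-memory map standing in for Redis so the example is self-contained (in production this maps onto Redis's `SET key value NX PX <ttl>`). The 30-second TTL and names are illustrative assumptions.

```javascript
// Simulated Redis lock: txHash -> expiry timestamp in ms.
const LOCK_TTL_MS = 30_000;
const locks = new Map();

function tryLock(txHash, now = Date.now()) {
  const expiry = locks.get(txHash);
  if (expiry !== undefined && expiry > now) {
    return false; // another processor holds a live lock -- skip this item
  }
  // Either unlocked, or the previous holder hung/died and the lock expired;
  // the item's age already gives it higher priority back in the queue.
  locks.set(txHash, now + LOCK_TTL_MS);
  return true;
}

function release(txHash) {
  locks.delete(txHash); // processing finished; no other worker needs to retry
}
```

The expiry check is what guarantees a hung processor can't strand a transaction forever: once the TTL passes, the next worker's `tryLock` succeeds and the now-older item is processed at higher priority.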

This method allowed us to scale to support Layer 2 chains such as Polygon (previously MATIC), which had very high transactional throughput. At that point the issue wasn't the backend's ability to keep up, but the front-end's ability to render thousands of characters. However, that was a different issue, and not one relevant to the scope of the backend project.
