Unstoppable Data for Unstoppable Apps: A DATAcoin Whitepaper Draft

There’s a revolution brewing in computing infrastructure. The winning architecture will be decentralized apps (Ðapps): Front-ends served to the browser from decentralized storage, and back-ends implemented as smart contracts living on the blockchain.

This stack has many desirable properties: It makes applications naturally resistant to distributed denial-of-service attacks, it adds transparency and security by keeping a tamper-proof transaction log, it has zero reliance on commercial services, and it enables easy peer-to-peer transfers of value in every application.

But what serves this new breed of applications with data? How to make an instant messaging Ðapp capable of delivering a million messages per second? What about a stock trading Ðapp? Vehicle fleet tracking? Factory automation? Air traffic control?

At first sight, blockchains as decentralized databases seem perfect for data distribution. (We’ve even demonstrated the promise of this pattern in a previous blog post.) It’s a powerful concept: write a piece of data to a smart contract, and all nodes in the network get a copy. Front-ends can watch the contract data on a local node, and update accordingly. Unstoppable! Ok, data delivery is a bit slow due to mining delay, but hey, still unstoppable!

Unfortunately, it’s also unscalable. My granny’s smart fridge produces more data than any current public blockchain can reasonably handle. Transaction capacities are low, and writing any more than a minuscule amount of data into the chain incurs prohibitive gas costs.

If only there was a real-time, low-latency, scalable, peer-to-peer data transport layer sitting on top of the blockchain to handle the heavy lifting! Ðapps as well as centralized apps could then be served with unstoppable data by the network. The most important events — or aggregates calculated from the raw data, as in this post — could still be delivered to the underlying blockchain. The chain would also be perfect for supporting the operation of the data network by facilitating incentivization and permissioning.

My granny lives on a small pension, but she’s pretty clever as far as money matters go. What if the data network were powered by a usage token called DATAcoin, and we added native support for data monetization? Now my granny could offer her smartwatch heart rate data feed for sale, and if there are enough grannies out there, pharmaceutical companies would certainly be interested. Or I could subscribe to her heart rate, just to know she’s ok. She’s made a smart contract to incentivize her doctor to keep her alive for another 100 years, but still!

For months, we’ve been developing the idea of this network and its related ecosystems. We have converged on the concept of DATAcoin and a technology stack that could become the global real-time data backbone for decentralized apps. We humbly present a whitepaper draft on “Unstoppable Data for Unstoppable Apps”.

As always, we would really like your feedback. Please come and talk to us on Slack if you have ideas, comments, or suggestions. You can also reach us on Twitter.

Streamr brings U.S. stocks to Ethereum blockchain

What are the future implications of massively distributed, decentralised databases enabled by blockchains? Will it some day be possible to have a significant share of new data generated in the world – data from IoT, social media, stock exchanges and so on – pass through a blockchain-powered system for superior persistence, security, and peer-to-peer distribution? Data could be easily available, transparent, proven, and not controlled by giant corporations.

While present-day blockchains aren’t nearly scalable enough to ingest big data, we at Streamr are working on ways to make the above vision happen sooner rather than later. Stay tuned for more information about that in the near future. Meanwhile, the concept of real-time data flowing into a decentralised system can already be easily demonstrated with the existing Streamr platform and its built-in Ethereum integration.

For this week’s demo, we decided to inject the U.S. stock market into the blockchain. Well, not the whole market just yet, we’ll start with the stocks in the S&P 500 index. For now, direct input of data into the blockchain, combined with current blockchain scalability means that the idea can only be explored on a miniature scale, to avoid spamming the Ethereum blockchain and paying excessive gas costs.

In the demo, you can see a process that ingests streaming trade data from NASDAQ, aggregates the data into 15-minute OHLC (open, high, low, close) “bars” by stock symbol, and submits those bars to a smart contract deployed on the Rinkeby testnet. Data for all the stocks is reported in a single transaction, in order to avoid spamming hundreds of transactions every 15 minutes.
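The aggregation step can be sketched in a few lines of JavaScript. This is a minimal illustration rather than the actual Streamr pipeline; the trade object shape ({ symbol, price, time }) and the function names are assumptions made for the example:

```javascript
// Hypothetical sketch of the aggregation step: group trades into
// 15-minute windows and reduce each window to one OHLC bar.
var FIFTEEN_MINUTES = 15 * 60 * 1000

function bucketKey(trade) {
  // All trades of a symbol within the same 15-minute window share a key
  return trade.symbol + ':' + Math.floor(trade.time / FIFTEEN_MINUTES)
}

function toOhlcBar(symbol, trades) {
  // trades must be in chronological order within the window
  var prices = trades.map(function (t) { return t.price })
  return {
    symbol: symbol,
    open: prices[0],
    high: Math.max.apply(null, prices),
    low: Math.min.apply(null, prices),
    close: prices[prices.length - 1]
  }
}
```

One bar per symbol per window is what keeps the on-chain footprint small: however many trades arrive, only four numbers per symbol reach the contract every 15 minutes.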

What effectively happens is that everyone running a full Ethereum node will automatically have fast and verifiable local access to the published data points. A smart contract holds the data, and it can be queried by anyone to retrieve the events posted earlier (to display a chart, for example).

Here’s the live canvas (full screen, or open in editor):

The smart contract is at 0xcc85a5f3ecc0ae62023e35faeb9c72fea25d75d6, and its source code can be seen on the canvas inside the StockTicker contract. Here’s an example bit of web3 code for querying the Apple stock price history and watching for new events:

// Symbol to watch
var symbol = 'AAPL'

// Contract address and abi
var address = '0xcc85a5f3ecc0ae62023e35faeb9c72fea25d75d6'
var abi = [{"constant":false,"inputs":[{"name":"symbol","type":"bytes32[]"},{"name":"open","type":"int256[]"},{"name":"high","type":"int256[]"},{"name":"low","type":"int256[]"},{"name":"close","type":"int256[]"},{"name":"time","type":"uint256"}],"name":"setPrices","outputs":[],"payable":false,"type":"function"},{"inputs":[],"payable":false,"type":"constructor"},{"anonymous":false,"inputs":[{"indexed":true,"name":"symbolIdx","type":"bytes32"},{"indexed":false,"name":"symbol","type":"string"},{"indexed":false,"name":"time","type":"uint256"},{"indexed":false,"name":"open","type":"int256"},{"indexed":false,"name":"high","type":"int256"},{"indexed":false,"name":"low","type":"int256"},{"indexed":false,"name":"close","type":"int256"}],"name":"Price","type":"event"}];

// Instantiate the contract object
var contract = web3.eth.contract(abi).at(address)

// Convert symbol to hex bytes32, right-padded to 66 characters including '0x'
var symbolIdx = web3.fromUtf8(symbol).padEnd(66, '0')

// Fetch the complete history of Price events for the symbol,
// then keep watching for new ones
var filter = contract.Price({ symbolIdx: symbolIdx }, {
  fromBlock: 0,
  toBlock: 'latest'
})

filter.watch(function(error, event) {
  if (error) {
    console.error(error)
    return
  }
  // Convert integer BigNumbers back to double-precision primitives
  var bar = {
    symbol: event.args.symbol,
    time: new Date(event.args.time.toNumber()),
    open: event.args.open.toNumber() / 10000,
    high: event.args.high.toNumber() / 10000,
    low: event.args.low.toNumber() / 10000,
    close: event.args.close.toNumber() / 10000
  }
  console.log(bar)
})

Note that we originally used a string field to index the events, but ran into this bug when trying to query the events in web3. We worked around the issue by using a bytes32 representation of the symbol string for the purposes of this demo.
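For reference, the conversion this workaround relies on can be reproduced without web3 at all. The helper below is a hypothetical stand-in for web3.fromUtf8 plus zero-padding, assuming plain-ASCII ticker symbols:

```javascript
// Hypothetical equivalent of web3.fromUtf8(symbol).padEnd(66, '0'):
// encode an ASCII symbol as a zero-padded bytes32 hex string
// ('0x' + 64 hex characters = 66 characters in total).
function symbolToBytes32(symbol) {
  var hex = '0x'
  for (var i = 0; i < symbol.length; i++) {
    // Assumes printable ASCII, so every char code is two hex digits
    hex += symbol.charCodeAt(i).toString(16)
  }
  while (hex.length < 66) hex += '0'
  return hex
}
```

Because bytes32 is a fixed-size type, it can be used as an indexed event parameter and filtered on directly, which is exactly what the string field could not reliably do.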

While this example demonstrates the idea of decentralised streaming data delivery in the future, we’re currently working with Oraclize to enable smart contracts to obtain data from Streamr securely on a per-request basis, and on many different blockchains. This will be a topic for an upcoming blog post.

Overall, we begin to see hints of how a decentralized system for storing and transmitting data might work in practice. And of course, there is much more that can be done: we have not even scratched the surface of Ethereum’s other protocols, nor have we explored Polkadot Parachains or other multi-chain scaling schemes.

Decentralised data feeds are a new and exciting frontier, and we could not be more thrilled that the state of working technology already allows us to get this far! As always, we invite anyone who is as data-crazy as we are, to join us on our Slack, and on Twitter.

Streamr and Ethereum, or how we saw the light

It was the summer of 2012 when we put our thinking hats on to try and solve a particular problem that was collectively driving us more than a bit mad: It was extremely difficult to build and even more so to test trading algorithms based on so-called “real world” data. United by our hard-won comfort in the adrenaline-soaked world of high-frequency trading, we thus decided to build a set of tools to make our lives easier. This was the first version of Streamr, born as a way to assemble and distribute low-latency data feeds, and to rapidly build models and algorithms which monitor and react to this data.

Fast-forward a few years, and the Internet of Things is beginning to emerge. From the earliest steps, we intuited that this was something important, that perhaps the whole world would, in its own way, be moving in the direction of trading and finance. After all, what is the point of so many sensors, if not to provide real-time data and “identify” signals on the fly? This would naturally lead to algorithms taking automatic action to grab those fleeting-but-profitable opportunities, for the benefit of both individuals and businesses. And we were far from wrong: streaming analytics is now growing into one of the hot topics in the software business.

So we built a business around real-time data and streaming analytics. We created a functional and scalable low-code SaaS platform where it is easy to develop and deploy real-time microservices. We’ve gathered up a number of good customers, and there are many fascinating use cases in development. But until now, something has been missing from the picture.

Our ultimate vision is that everyone is able to use our platform and tools, connecting real-time data sources to other popular services and platforms, generally performing magic to solve customer problems and even create totally new services. But to accomplish this, another layer is needed, which provides both community and a mechanism for trust.

By late 2016, our home town of Helsinki was a true hotbed of innovation and startup activity. When a long-time friend introduced us to some interesting characters in the blockchain community, a number of illuminating conversations transpired, and we found ourselves thinking out loud:

Hey, there’s this wonderful community of tech-minded people who have created a decentralised, trustless network where you can have immutable facts mined and set in stone (or the digital equivalent) for anyone to see. And there’s yet another protocol where you can have computer code run in the same decentralised, trustless fashion, and all it takes is something called “gas”.

Our minds were blown. It’s not that we hadn’t heard of Bitcoin, blockchain, Ethereum, and other related technologies before, but it took a bolt of lightning for the reality to fully click. We already knew how to work with real-time data, and how to make life easy for those developing algorithms on top of streaming data. But what if we applied this to a decentralised computing machine? Could we ultimately offer the world a trust-free service with easy access to real-world data, and a killer usability layer? And what, in fact, would that mean?

After sleepless days and nights coddling our brains while pondering the profound and almost magical ramifications, the missing piece of the puzzle indeed fell into place for us. We could build this. And since we are as excited about Ethereum as we have ever been about anything that we have seen, we intend to go “all out”, as they say. At EDCON in Paris this year, I got on stage, connected a smart contract to a live feed from Helsinki City public transport, and deployed the contract, all in 5 minutes or so. A simple pay-by-use demo on the Ethereum platform, all using existing Streamr tools. It was fun, we got lots of positive feedback and ideas, and made some fascinating new friends.

Here’s an embedded video of the EDCON demo:

For those who didn’t make it to Paris, let me first briefly recap what the demo is about. The action takes place in Helsinki, Finland, where each tram is equipped with a GPS receiver as well as sensors which measure the vehicle speed, heading, and many other quantities. Each tram transmits its location and the sensor readings to the public transport agency’s backend every second or so. The live data feed is available at no charge over the MQTT protocol to any company or developer out there.

Now imagine that you’re in charge of the public transport in Helsinki, and a decision has been made to outsource the running of the tram network to a third party. How do you set up a smart contract which automatically incentivises the tram operator for running a tight ship (if you forgive me the pun)?

After some thought, we decided that it’s best to keep things simple to start with. The city pays the operator incrementally based on mileage. A smart contract keeps track of the mileage run and makes the payments as the trams happily chug along. The demo shows how such a smart contract can be set up easily, connected to a live data feed, and deployed in the chain.

The demo touches on one important aspect of real-world data as input to a smart contract. In this use case — as in many others — the volume of streaming data is sufficiently large to overwhelm any blockchain. There’s simply too much data to feed everything into a smart contract and have it processed by e.g. the Ethereum virtual machine.

What this means is that some off-chain processing is needed. In the demo, we make use of the Streamr platform, where much of the required streaming analytics functionality is already built in. This is what we do:

  1. First we connect to the live data stream from the Helsinki transport agency (this is the MQTT feed from the trams).
  2. We visualize the live tram data on a module showing the city map with the live location of every tram. The purpose here is simply to make sure that the data looks sensible, and is something we can build on.
  3. Next we deploy the smart contract. Our aim is to build ready-made, reusable smart contract templates readily available in Streamr. Here, we’re using one called PayByUse. We can open the template code in a built-in Solidity editor in the Streamr front-end. The contract is initialized with information on whom to pay and how much (in wei) to pay per reported unit (traveled meter). When the process is running, a function called update will be called every once in a while to report usage to the smart contract, which calculates and makes the payments.
  4. We need to calculate the mileage run by each tram in the city. We get that done by calculating speed * delta time (equals distance, remember high-school physics?) from subsequent measurements for each tram. In the below image this is abstracted inside the ForEach module.
  5. As discussed above, we cannot feed in all the real-time data to the smart contract. What we do instead is to accumulate the little distance increments until a threshold (10,000 meters in this case) is reached, report that to the smart contract, and reset the accumulator back to zero. The contract makes the transaction where the agreed amount of wei is calculated and transferred to the tram operator every time the threshold is crossed.
  6. Now we’re pretty much done. Let’s save the workspace, press the start button, and see what happens. And voilà — the mileage starts accumulating, and as soon as a total of 10,000 meters is reached, an update function call is made, and wei is transacted.
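Steps 4 and 5 above can be sketched as follows. This is an illustrative accumulator, not Streamr’s actual ForEach module; the reportToContract callback is a stand-in for the real update call on the PayByUse contract:

```javascript
// Illustrative sketch of steps 4-5: integrate speed over time per tram,
// and report to the contract whenever the threshold is crossed.
// reportToContract stands in for the smart contract's update call.
var THRESHOLD_METERS = 10000

function makeAccumulator(reportToContract) {
  var accumulated = {}  // meters accumulated per tram id
  var lastTime = {}     // timestamp (ms) of the previous measurement per tram id

  return function onMeasurement(tramId, speedMetersPerSec, timeMs) {
    if (lastTime[tramId] !== undefined) {
      // distance = speed * delta time (the high-school physics from step 4)
      var deltaSec = (timeMs - lastTime[tramId]) / 1000
      accumulated[tramId] = (accumulated[tramId] || 0) + speedMetersPerSec * deltaSec
      if (accumulated[tramId] >= THRESHOLD_METERS) {
        // Step 5: report the accumulated mileage and reset to zero
        reportToContract(tramId, Math.round(accumulated[tramId]))
        accumulated[tramId] = 0
      }
    }
    lastTime[tramId] = timeMs
  }
}
```

Keeping the accumulator off-chain means the contract only sees one transaction per 10 km per tram, instead of one per second per tram.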

Below you can see a live Streamr Canvas (a workspace) with the above steps plus some visualisations added. It shows the actual real-time feed from the Helsinki city transport connected to a smart contract. You’re free to have a look, poke around, kick the tires. Note though that this demo connects to the Rinkeby testnet, and it doesn’t worry about things like secrecy (you’ll find the private key in there but please do not withdraw the ether, it will spoil the fun for all others😊).

You can also view the below Canvas in a new tab or in our editor (requires free signup). This environment is our experimental Ethereum-enabled environment; our production environment is here.

There are of course many simplifications in here. In practice you’d want to take many other variables into account in the payment schedule: The number of passengers, keeping to the timetable, even the ride quality, and so on. In many use cases you would want to have a good look at the data provenance (here we trust the public transport authority to provide clean and truthful data). And there’s the good question of how we can prove that any off-chain processing achieves what is agreed, given that such processing by definition takes place outside the blockchain proper. These are not easy questions, but we know there’s a lot of experimentation on interesting solutions taking place out there. For the sake of the demo, we’ve glossed over these questions; not because they are unimportant, but because all the solutions are not quite there yet.

As we see it, the EDCON demo is but a taster of what we’d like to bring to the table. These are the earliest of days, and we don’t yet know exactly where we’ll end up, what our role might ultimately be, or much else. We certainly don’t claim to have all the answers, or yet have the deep technical expertise in the space that some others do (by the way, we profusely thank those people for their public knowledge sharing!). But we’re willing to learn, exchange ideas, and partner up with those who truly do know. These are the most exciting of times, and we can’t wait to contribute our share!

We have recently set up a public Slack. Please join and let us know what you think.

EDIT: Migrated demo from Ropsten to Rinkeby testnet

Streamr in top three in Arctic15 — see Henri’s pitch!

We made the case for Streamr in the Arctic15 pitching competition on June 3rd in Helsinki. The above video shows Henri, our CTO, on the stage in the final round. Henri was on a roll in front of a jury chaired by Tim Draper. We finished in the top three out of the 400+ startups that took part!

Arctic15 is an exciting newcomer in the startup event scene, and it’s already attracted an enthusiastic following. This event brings together global investors and angels with growth companies and startups from the Nordics and Baltics.

In case you really hate videos, this is the gist of our pitch:

Almost any business runs the risk of losing value and money by not reacting to new data immediately. We’re on a mission to fix this.

Streamr was originally built as a trading platform that monitors millions of tiny market events and reacts automatically as and when needed. Outside the financial domain, the real-time events could originate from sensors or social media, or they might represent weather data or geolocation coordinates.

But building a scalable streaming back-end and developing the algorithms that instantly react to incoming data can be a very complex process which consumes time and money.

What we want to do is disrupt this long process. With Streamr, a developer or enterprise gets up and running in minutes instead of months. We provide a drag and drop interface, where a user can quickly and easily create micro-services and algorithms to power real-time applications.

Typical use cases are anomaly detection, different kinds of alerts, and situation rooms. You’ll find the data and algorithms relevant to your business in the Streamr marketplace. It’s like the App Store for real-time data.

The Streamr platform is up and running and was launched in Arctic15 as a beta for the general audience. There’s revenue from pilot customers, and we’re growing to meet the needs of a global clientele. We’ll make your streams come true!

Hack to the future

Why world-changing ideas are now born in hackathons.


At Streamr, we’re very happy to have recently won the Kone Hackathon 2016 with our telepathic elevator concept. To celebrate that, we decided to share with the world why we think hacking and hackathons are so awesome for coming up with brave new ideas.

Hackers arise from the basement

Of course, hacking hasn’t always been something you celebrate. Ever since the 80s, mainstream media has used the word hacker as a slur for terrifying cybercriminals and basement-dwelling teenage hooligans. But the roots of the word are elsewhere.

Hackers first appeared at places like MIT in the dawning age of electronics, with the word meaning hacking away, tinkering, or carving cool solutions for interesting technical problems. These ranged from model railways to telephony and increasingly, computers. Early hackers took pride in making those very primitive computers do as much as possible.

In our age, with everything run by computers, skilled hackers have become a valuable part of the job market. Why? Because the best innovations come from people who are unstifled by bureaucracy. Hackathons are the perfect way for companies to allow the sort of creative thinking that’s required for seriously revolutionary ideas.

From coder challenges to bureaucracy bashing

The term hackathon first surfaced back in 1999, when it was used simultaneously to describe an OpenBSD open source UNIX project and a Java-related challenge at Sun Microsystems. Now that the internet’s everywhere, these intense and marathon-like events have become hugely popular. No wonder, because for something that usually lasts anything from just a few hours to a weekend, the results tend to be very impressive. In fact, even Facebook’s like button is the end product of a hackathon, as is its newsfeed, as is tagging people in comments!


Facebook is just the kind of company that already loves the hackathon model – and the hacker mindset. And it’s not alone. Other companies that swear by hackathons include Google, Yahoo, Microsoft, Salesforce, Amazon, BBC, NASA, LinkedIn, Zappos, and Shutterstock. We’re not in any way surprised either that several high-flying startups have been conceived during hackathons, with EasyTaxy, FashionMetric and the Skype-acquired GroupMe being just some of the best known.

It’s understandable, then, that the hackathon model is rapidly expanding to non-technical areas, too. A couple of weeks ago a group of organizations in Helsinki hosted a hackathon on basic income policy hacking. They asked participants to think of a policy that allows a simplified, bureaucracy-slashing model for social security to be combined with extra benefits.

The good hack goes mainstream

The Finnish company Industryhack, which has created a successful business around organizing hackathons and was also behind the Kone event, is confident that more and more traditional companies will soon embrace the hackathon model.

“Hackathons are great for encouraging new innovations and ideas, as long as the host company is committed and gives access to resources that aren’t normally available to outsiders. To make our own hackathons more compelling to participants, we usually require our customers to invest in the future development of the ideas, too,” Vilén says.

Concepts from Industryhack hackathons that have so far become reality include Cybercom’s Machinebook, a shared social network for humans and machines for improving maintenance, and Lassila & Tikanoja’s kimppanouto.fi service, with which you can get your unwanted possessions picked up for recycling. The latter is currently in beta testing.

Big, bold thinking makes magic

So, what about the Kone Hackathon? As Arthur C. Clarke once wrote, any sufficiently advanced technology is indistinguishable from magic. Our magic was to invent a telepathic elevator, which would totally eliminate calling the elevator and waiting for it.


As Henri Pihkala, our CTO, puts it, “we thought about how Kone’s vision of perfect People Flow would work in the movies. How would James Bond use an elevator? He would simply leave his apartment and an elevator would magically appear, just in time for him to elegantly walk right in.”

We decided to think big and that paid off, with Kone giving us not just the top prize but also a big thumbs up for our attitude.

“Streamr were the boldest and came up with a concept showing innovative thinking for a truly exceptional end-user experience,” says Jukka Salmikuukka, Kone’s Director of New Business Concepts.

The starting point for something great

In the words of Henri, the Kone hackathon was just a starting point: “the real work starts after the event.” Here at Streamr we’ll be doing our utmost to make the elevator happen, and it’ll be fascinating to co-operate with one of Finland’s largest corporations to achieve that.

Whatever happens, we’re confident this won’t be the last hackathon in which we participate. In fact, we find the model not just compelling but also rather addictive. Might it be the case that we creative developers thrive on impossible deadlines, stealthily snatched victories and sleep-deprived success rather than a 9-5 schedule and a safe monthly salary? Whether you agree or disagree, we’d love to hear your thoughts in the comments below.
