The need to exchange information in real-time has never been more apparent. The changes occurring in everyday life due to the outbreak of the novel coronavirus have been nearly unimaginable for many of us, and staying informed feels like a lost cause.

What does this have to do with why architects need an event portal? In the last few days, I have personally been wondering what I could do to help the world get more real-time information in order to make better decisions. While my colleagues and I have been practicing social distancing and self-isolation, I felt we could do more.

One of the things we have been using to stay up to speed on the spread of COVID-19 is the Johns Hopkins dashboard, which news outlets around the world use as a “source of truth.”

It occurred to me that the data is constantly changing, yet many media outlets are basing their reporting on old information. I wondered how to get this data in real-time and quickly found others who wanted the same capability (no surprise that my idea was not original or unique!). So, like the technologists we are, we set off to make it happen, and our work exemplified why our new product, Solace PubSub+ Event Portal, is so useful when you’re collaborating on the development of event-driven systems.

It’s easy to sketch an idea on a bar napkin or coaster (like we did in the good old days pre-coronavirus), and banter about how awesome it would be. The challenge has always been taking it to the next step of specificity, i.e. figuring out what exactly it does, how and why it does that, how people or other systems interact with it, and how it could be implemented.

Over the past several years, there has been an ongoing joke at Solace that we are just “plumbers of data” because our event broker puts in place “smart pipes”. Imagine you’re tasked with building an airport and you need to design the systems by which water, steam, and natural gas are distributed. Could you imagine sketching out the routing on a napkin? How would you express the routing, diameter, and contents of each pipe? And how would you define the end user interactions, i.e. faucets, toilets, ice machines, etc.? You wouldn’t because the architects who design buildings use specialized tools that let them design and define what each building needs. Those designs are converted to blueprints, cut-sheets, and parts lists so the plumbers who’ll be putting the pipes in place have a reliable, fully informative picture of what they must build, and it gets built per design.

Let’s look at how an event portal helps application architects do the same thing for developers.

Creating an Event-Driven Application with an Event Portal

Our goal was to create an application that acted as a client to the Feature Service, collected case data for each country, and published it to a topic. Pretty simple, right? Well, in the past, without any specialized tooling, we would have created a Confluence page that described the work to be done, the schema, the event topic hierarchy, and most likely a PowerPoint diagram that showed the “system actors” and how the data would flow. Yuck! API management solutions are no help because they focus on synchronous RESTful APIs, not the event-driven world. But it’s 2020, so let’s look to the future and see how an event portal can ease the design of an event-driven architecture.

Step 1: Identifying Updates

The first thing we had to figure out was how to detect changes to the Johns Hopkins data set and understand the data formats and structures. It turns out that the Johns Hopkins page is powered by a product called ArcGIS, which implements the Open Geospatial Consortium’s Web Feature Service (WFS) standard.

I began designing this new event-driven application using PubSub+ Event Portal. We were able to invoke the Feature Service in Postman, get a response, and use a JSON-to-JSON schema generator to reverse engineer the data models. Now that we had the data, we decided to poll for updates every 30 seconds and compare each new snapshot with the previous one to identify changes.
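The poll-and-compare step can be sketched as a simple diff between two snapshots. This is a minimal illustration, not our actual implementation: the map-of-country-to-count shape and all names here are hypothetical stand-ins for the richer records the Feature Service actually returns.

```java
import java.util.HashMap;
import java.util.Map;

public class ChangeDetector {

    // Returns only the entries whose counts differ between the previous
    // poll and the current one (new countries count as changes too).
    static Map<String, Integer> detectChanges(Map<String, Integer> previous,
                                              Map<String, Integer> current) {
        Map<String, Integer> changed = new HashMap<>();
        for (Map.Entry<String, Integer> entry : current.entrySet()) {
            Integer old = previous.get(entry.getKey());
            if (old == null || !old.equals(entry.getValue())) {
                changed.put(entry.getKey(), entry.getValue());
            }
        }
        return changed;
    }
}
```

Only the changed entries would then be published as events, so consumers see updates rather than full snapshots every 30 seconds.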

These updates needed to be published so that consumers could filter them by country, region, or whichever attribute had been updated (e.g., confirmed diagnosis, recovery, or death). We designed the events using our Topic Best Practices Guide. This design, including some fruitful debates over the topic hierarchy, took about an hour.
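To make that filtering concrete, here is one hypothetical topic hierarchy in the spirit of the Topic Best Practices Guide; the actual hierarchy we debated is not reproduced in this post, so the `jhu/cases/updated/v1/...` pattern below is purely illustrative.

```java
public class TopicBuilder {

    // Hypothetical hierarchy: domain / object / verb / version /
    // country / region / attribute. Each level a consumer might want
    // to filter on gets its own topic segment.
    static String updateTopic(String country, String region, String attribute) {
        return String.format("jhu/cases/updated/v1/%s/%s/%s",
                country, region, attribute);
    }
}
```

With a hierarchy like this, a subscriber can use Solace topic wildcards to filter server-side, e.g. `jhu/cases/updated/v1/US/>` for every U.S. update, or `jhu/cases/updated/v1/*/*/death` for death-count changes anywhere.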

Step 2: Enabling Developers & Creating the Applications

The next step was to get this blueprint in the hands of the people who could take advantage of it, i.e. developers. Since PubSub+ Event Portal supports the AsyncAPI spec, we were able to generate a code skeleton for each application we wanted to build. Over the past few months, Solace has been working to deliver an AsyncAPI code generator that would help Spring Boot Microservices developers easily create event-driven microservices.

Essentially all we had to do was download the AsyncAPI documents related to the application we wanted to develop and run them through the code generator, which produced a Spring Cloud Stream application with a comment that says “add business logic”. This meant that even an architect like myself could help by slinging some code. Over the next few hours, we created several of these event-driven Spring Cloud Stream microservices that implemented the design captured in PubSub+ Event Portal.
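The shape of such a generated skeleton looks roughly like the sketch below. In a real Spring Cloud Stream application the function would be exposed as a `@Bean` and bound to input/output destinations via configuration; here it uses only `java.util.function` (which Spring Cloud Stream's functional style builds on) so the sketch compiles without Spring on the classpath, and the transformation inside is a hypothetical placeholder.

```java
import java.util.function.Function;

public class CaseUpdateProcessor {

    // In the generated application this Function would be a @Bean that
    // Spring Cloud Stream binds to the broker; the generator leaves the
    // lambda body as the "add business logic" placeholder.
    static Function<String, String> processCaseUpdate() {
        return payload -> {
            // add business logic: e.g. transform the raw feature-service
            // JSON into the event payload published to the topic hierarchy
            return payload.trim();
        };
    }
}
```

The appeal is that the messaging plumbing (connections, serialization, topic bindings) is generated from the AsyncAPI document, so the developer only fills in the lambda.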

Step 3: Get the Word Out

With the design in our event portal and the component applications implemented, we were producing high-value events that could be consumed by anything or anyone that wanted them. The next step was to document what we’d done and let people know about it. Since we used PubSub+ Event Portal to capture our descriptions of, and relationships between, applications, events, and schemas from the very beginning, it acts as a great source of truth for documentation.

For key clients who wanted to make use of the data, we gave direct access to our event portal. But for other users in the community, we were also able to export much of the information from the event portal and add it to a GitHub page so the community could understand how to connect and what was available.

One lesson we learned was the need to make events and schemas publicly consumable so you don’t have to provide a “login” or copy information (which will become stale) into a GitHub manifest. That way the community can browse the menu of available events and schemas while viewing the data directly from our “source of truth,” which is synchronized as updates occur. For example, we thought a stream that watches “U.S. Cases Confirmed” updates and offers up the percent of the population infected in each state would be interesting. We added it to the event portal, implemented another microservice… and forgot to add it to the GitHub page (this capability is now on our roadmap).
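The derived metric behind that extra microservice is a simple percentage. A minimal sketch, assuming per-state confirmed-case counts and population figures are available (the method and figures below are illustrative, not from the actual service):

```java
public class InfectionRate {

    // Confirmed cases as a percentage of a state's population.
    static double percentInfected(long confirmedCases, long population) {
        if (population <= 0) {
            throw new IllegalArgumentException("population must be positive");
        }
        return 100.0 * confirmedCases / population;
    }
}
```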

Conclusion

We were able to develop an ecosystem of event-driven applications that consumed and reproduced a set of Johns Hopkins data about the spread of COVID-19. Without PubSub+ Event Portal it would have taken around a week of total time, but the ability to easily collaborate on the design and implementation let us do it in one day. While not all enterprise projects will be this easy to deploy, I think it’s a great demonstration of how PubSub+ Event Portal can give organizations the power to dynamically respond to new requirements and changing conditions.

If you have any other clever ideas, you can try PubSub+ Event Portal for free and develop your own event-driven applications to share real-time data. We’d love to see what you can do with this toolset and are eager to receive any feedback you may have on our product.