
How event sourcing makes Omoik reliable, auditable, and insightful


When building Omoik, the privacy-friendly survey platform, I faced a familiar challenge: how do you guarantee reliability, flexibility, and compliance for clients, without making life harder for myself or customers? My answer: event sourcing.

Event sourcing isn't just a technical pattern; it's a business enabler. It makes software simpler to maintain, easier to extend, and dramatically more useful for clients who need detailed analytics, dashboards, and audit trails. Let's dive into what that means in practice.

What is Event Sourcing?

I recently posted a bit of a deep-dive on event sourcing, so if you're not quite familiar, have a look at that post. Here's a quick summary!

Instead of storing only the current state of your data (like "survey X has 42 responses"), event sourcing records every change as an event:

  • SurveyCreated
  • QuestionAdded
  • SubmissionCompleted
  • SubmissionClosed

These events are stored sequentially in an event store (I use Kurrent for Omoik). The current state of a survey, or any part of the system, is then built by replaying these events.
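To make that concrete, here's a minimal sketch of what such event types could look like in Go. These definitions are illustrative only; Omoik's real events carry more fields than this:

// Illustrative event definitions - facts that have happened, nothing more.
// (Only the standard library "time" package is needed here.)
type SurveyCreated struct {
    SurveyID  string
    Title     string
    CreatedAt time.Time
}

type QuestionAdded struct {
    SurveyID   string
    QuestionID string
    Text       string
    AddedAt    time.Time
}

type SubmissionCompleted struct {
    SurveyID    string
    SessionID   string
    CompletedAt time.Time
}

Replaying these, oldest to newest, gives you the current state of a survey. Nothing is ever updated or overwritten.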

Why bother with this "complexity"?

For developers: Each piece of code only needs to handle a small, well-defined event. No more tangled business logic trying to account for every possible state. The only question you'll need to answer is: "Can I store this event?"

For businesses: You get a full audit trail, the ability to analyze trends over time, and dashboards that can answer "what happened, when, and why?" - even if you only think of these features years after you've started building your app.

Technical benefits: simpler code and smaller context

Event sourcing breaks down complex workflows into bite-sized events. Each event handler only cares about one thing at a time. This makes code easier to write, test, and debug. Here's an example of handling a survey submission in Go:

// Keep track of the current aggregate
type SubmissionAggregate struct {
    Events     []cqrs.Event
    IsStarted  bool
}

// Check if the command has any impact on the aggregate and can be translated into an event
func (a *SubmissionAggregate) HandleCommand(ctx context.Context, command cqrs.Command) error {

    switch typedCommand := command.(type) {
    case *StartSurveySubmissionCommand:

        if a.IsStarted {
            // Already started
            return nil
        }

        if err := a.ApplyEvent(ctx, &events.SurveyStartedEvent{
            SurveyID:  typedCommand.SurveyID,
            SessionID: typedCommand.SessionID,
            UserAgent: typedCommand.UserAgent,
            Language:  typedCommand.Language,
            Platform:  typedCommand.Platform,
            Url:       typedCommand.Url,
            StartedAt: typedCommand.StartedAt,
        }); err != nil {
            return err
        }
    }

    return nil
}

// Apply the event to the aggregate
func (a *SubmissionAggregate) ApplyEvent(ctx context.Context, event cqrs.Event) error {

    a.Events = append(a.Events, event)

    switch event.(type) {
    case *events.SurveyStartedEvent:
        a.IsStarted = true
    }

    return nil
}

You've got an aggregate. Its only job is to make decisions about incoming commands. The aggregate contains all the information you need to decide whether an event should be applied. Nothing more, nothing less. Simple! Each handler is small, focused, and easy to test. I prefer using a switch statement, but you can easily rework this aggregate into something like this:

func (a *SubmissionAggregate) StartSubmission(ctx context.Context, cmd *StartSurveySubmissionCommand) error {

    if a.IsStarted {
        // Already started
        return nil
    }

    return a.ApplyEvent(ctx, &events.SurveyStartedEvent{
        SurveyID:  cmd.SurveyID,
        SessionID: cmd.SessionID,
        UserAgent: cmd.UserAgent,
        Language:  cmd.Language,
        Platform:  cmd.Platform,
        Url:       cmd.Url,
        StartedAt: cmd.StartedAt,
    })

}

The implementing code isn't what matters here; the events are. However you decide to store them, the events (facts, things that have happened) are the most important part of the system.

Business benefits: Analytics, dashboards, and compliance

Technical details are important, but if the business doesn't see the point and only sees a time investment, event sourcing isn't happening. Luckily, there are a few (very compelling) reasons for the business to also want to use event sourcing.

Analytics, even about the past

Because every event is stored, you can build new reports or dashboards at any time, even for data collected years ago. Want to know how response rates changed after a specific question was added? Add a new projector, feed it all the events it needs, and generate on-the-fly reports.

This is impossible in a traditional (CRUD) application: data gets overwritten, so you lose information about the past. You also can't retroactively reconstruct historical data when you come up with new features.

These events are just things that have happened. What you do with them is up to you, but these things have happened.

Effortless dashboards

The business loves dashboards, because they provide actionable data. Dashboards become easy to generate and update, because projections (summaries of event data) are always up-to-date and can be changed and rebuilt from scratch when needed.

When the business wants reports in a traditional application, you'll have to:

  • Perform (often) complicated database queries
  • Hope the data in the past was the same as the data in the present
  • Hope fields in the database haven't been overwritten at some point

So yes, you can still make great reports in these applications, but it's much simpler in event sourcing:

type SurveyStatistics struct {
    NumberOfViews       uint64
    NumberOfSubmissions uint64
}

func (a *SurveyStatistics) ApplyEvent(ctx context.Context, event cqrs.Event) error {

    switch event.(type) {
    case *events.SurveyStartedEvent:
        a.NumberOfViews++
    case *events.CompletedEvent:
        a.NumberOfSubmissions++
    }

    return nil
}

After you've processed all events, you'll have an in-memory model with the number of views and the number of submissions. I think you can agree with me that this code is pretty easy to understand, even if you're not a programmer or have never worked with Golang.
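As a rough sketch of that processing step, here's what replaying the events into the projection could look like. BuildSurveyStatistics and storedEvents are stand-ins for however you load events from your store:

// Rebuild the statistics by replaying every stored event, oldest first.
func BuildSurveyStatistics(ctx context.Context, storedEvents []cqrs.Event) (*SurveyStatistics, error) {

    stats := &SurveyStatistics{}

    for _, event := range storedEvents {
        if err := stats.ApplyEvent(ctx, event); err != nil {
            return nil, err
        }
    }

    return stats, nil
}

Run that over the full history and the dashboard numbers fall out. Rebuild it from scratch whenever the definition of a "view" changes.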

Audit trails for compliance, security, and trust

Every action is recorded, timestamped, and attributed. This is a huge win for clients who need to meet compliance standards, investigate anomalies, or simply build trust with their own users.

Clients can answer questions like:

  • "Who changed this question, and when?"
  • "Were any responses deleted?"
  • "What exactly happened during that incident last month?"

The events don't lie and cannot be changed, so you always see exactly how your data was changed.
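That audit trail works because every stored event can carry metadata alongside its payload. The envelope below is a hypothetical sketch, not Omoik's actual storage format:

// Hypothetical envelope around each stored event.
type StoredEvent struct {
    EventType  string    // e.g. "QuestionChangedEvent"
    StreamID   string    // the survey or submission this event belongs to
    ActorID    string    // who triggered the action
    RecordedAt time.Time // when the event was appended to the store
    Payload    []byte    // the serialized event itself
}

Answering "who changed this question, and when?" then becomes a simple filter over these envelopes instead of a forensic reconstruction.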

Real-World example: Omoik

Let's look at a real-world example of how event sourcing makes business sense for Omoik. Customers always ask for new ways to analyze their data, new ways to collect this data, and new reports based on the submissions from their visitors. And when you write software, you'll eventually deal with bugs, which can cause incorrect data and statistics. Event sourcing helps with all of this.

When rolling out these new features, I can simply add new event types without risky database migrations or fear of data loss. You can test larger data migrations as much as you want, but running them in production is always a little nerve-wracking. With event sourcing, you can add a new event or change your projector, feed it all the relevant events, and produce data that serves your new purpose. There is no risk of data loss, because your source of truth is the events, not the projections.

Much like data migrations for new features, bugs can also have an impact on the data you show your customers. If a bug ever affects data collection, I can replay the entire event log to restore correct analytics.

Another real-world example is custom reports. Clients often request custom reports on survey activity, even for periods before those reports were ever requested, and because every action is recorded as an event, I can generate these insights after the fact. I can even answer highly specific questions like: how many visitors answered question 1, but not question 2, before we changed question 2 from "How did you hear of us?" to "How did you find our website?". With event sourcing, this is a simple task.
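As a sketch of how that last question could be answered, here's a small projection. The event types (QuestionAnsweredEvent, QuestionChangedEvent), their fields, and the question IDs are hypothetical stand-ins, not Omoik's real definitions:

// Counts sessions that answered question 1 but not question 2,
// limited to answers given before question 2 was rewritten.
type Q1WithoutQ2 struct {
    questionChangedAt time.Time
    answeredQ1        map[string]time.Time // sessionID -> when question 1 was answered
    answeredQ2        map[string]bool      // sessionID -> answered question 2 at all
}

func NewQ1WithoutQ2() *Q1WithoutQ2 {
    return &Q1WithoutQ2{
        answeredQ1: map[string]time.Time{},
        answeredQ2: map[string]bool{},
    }
}

func (p *Q1WithoutQ2) ApplyEvent(event cqrs.Event) {
    switch e := event.(type) {
    case *QuestionChangedEvent:
        if e.QuestionID == "question-2" {
            p.questionChangedAt = e.ChangedAt
        }
    case *QuestionAnsweredEvent:
        switch e.QuestionID {
        case "question-1":
            p.answeredQ1[e.SessionID] = e.AnsweredAt
        case "question-2":
            p.answeredQ2[e.SessionID] = true
        }
    }
}

func (p *Q1WithoutQ2) Count() int {
    count := 0
    for sessionID, answeredAt := range p.answeredQ1 {
        if answeredAt.Before(p.questionChangedAt) && !p.answeredQ2[sessionID] {
            count++
        }
    }
    return count
}

Feed it the same event stream you already have and call Count(). No schema changes, no backfills.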

Event sourcing means Omoik can quickly deliver new capabilities, answer unforeseen business questions, and ensure data reliability, all while keeping the platform stable and trustworthy.

Should you use event sourcing?

Yes, event sourcing is more complex to set up than a CRUD application, but the payoff in reliability, flexibility, and business insights is massive.

For developers: Writing and maintaining code is simpler. Bugs are easier to track down, and new features are less risky to add.

For businesses: You get transparency and compliance, and it becomes much easier to answer new questions about things that have already happened.

Conclusion: Building for the Future

Event sourcing has been a game changer for Omoik. It's helped me deliver a platform that's not just technically robust, but also genuinely useful for clients, whether they care about compliance, analytics, or just want to know what their visitors are thinking. That's the beauty of it! They don't need to know Omoik is built using event sourcing, because it doesn't get in their way.

If you're interested in privacy-friendly, auditable platforms, or want to see how event sourcing could help your business - feel free to reach out. I'm always happy to chat about practical software solutions that make a real impact.

Want to see more technical deep dives or business case studies? Connect with me on LinkedIn or drop me a message!

Posted on: July 3rd, 2025


Roelof Jan Elsinga
