Merged
21 changes: 21 additions & 0 deletions lib/unleash/backup_file_writer.rb
@@ -0,0 +1,21 @@
require 'unleash/configuration'

module Unleash
  class BackupFileWriter
Contributor Author:

extracted from toggle_fetcher so that we can share it between polling and streaming

Collaborator:

Bikeshedding: This might be better served / more idiomatic by using mixins in Ruby.

Here is an example:

https://blog.appsignal.com/2021/01/13/using-mixins-and-modules-in-your-ruby-on-rails-application.html

or

Using include with Modules (Mixins):
The include keyword is used to add a module's methods as instance methods to a class. This is a common way to achieve "mixin" behavior, sharing functionality across unrelated classes.

module Greetable
  def greet_instance
    puts "Hello from the instance!"
  end
end

class MyOtherClass
  include Greetable # Adds greet_instance as an instance method
end

my_instance = MyOtherClass.new
my_instance.greet_instance # Output: Hello from the instance!

Contributor Author (@kwasniew), Aug 20, 2025:

Questions I asked myself:

  • do I need to access object state from a mixin method?
  • do I want the save-file operation in the fetcher API and in the streaming processor API?
  • are the toggle fetcher and the streaming event processor a backup file writer (IS-A), or do they use (HAS-A) a backup file writer?
  • is this code readable for someone maintaining 5 other SDKs on a daily basis?

They led me to the current solution; a rough sketch of the trade-off follows.
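For illustration only, a minimal sketch of the two shapes being weighed; everything here (class names, file path) is hypothetical and not the SDK's actual code:

```ruby
# Rejected shape: a mixin (IS-A-ish). The save method becomes part of every
# including class's own API and tacitly depends on that class's state.
module BackupFileWritable
  def save_backup!(toggle_data)
    File.write("/tmp/unleash-backup-sketch.json", toggle_data)
  end
end

class HypotheticalToggleFetcher
  include BackupFileWritable # the fetcher now "is" a backup writer
end

# Chosen shape: plain composition (HAS-A / uses). A stateless class method is
# called explicitly, and nothing is mixed into the callers' interfaces.
class SketchBackupWriter
  def self.save!(toggle_data)
    File.write("/tmp/unleash-backup-sketch.json", toggle_data)
  end
end

class HypotheticalStreamingProcessor
  def handle_update(full_state)
    SketchBackupWriter.save!(full_state) # explicit, easy-to-trace dependency
  end
end

HypotheticalToggleFetcher.new.save_backup!('{"features":[]}')
HypotheticalStreamingProcessor.new.handle_update('{"features":[]}')
```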

Member:

> is this code readable for someone maintaining 5 other SDKs on a daily basis

❤️

I think this solution is fine, honestly. If I were going to fuss, it'd be in the direction of making this explicitly stateless rather than the tacit connection that a mixin brings here. I vote we leave it alone; I think it's a nice improvement to the code structure.
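If anyone ever wanted to make the statelessness explicit, one hypothetical way (not what this PR does) is a module of module_functions, so there is no class and no instance state at all:

```ruby
require 'tmpdir'

# Hypothetical sketch: module_function makes save! a function on the module
# itself, so callers can't accumulate state on a writer instance.
module StatelessBackupWriter
  module_function

  def save!(toggle_data, backup_file)
    tmp = "#{backup_file}.tmp"
    File.write(tmp, toggle_data)
    File.rename(tmp, backup_file) # same write-then-rename as the real save!
  end
end

StatelessBackupWriter.save!('{"features":[]}', File.join(Dir.tmpdir, 'unleash-backup-sketch.json'))
```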

Collaborator:

Your points make sense! Go for it!

    def self.save!(toggle_data)
      Unleash.logger.debug "Will save toggles to disk now"

      backup_file = Unleash.configuration.backup_file
      backup_file_tmp = "#{backup_file}.tmp"

      File.open(backup_file_tmp, "w") do |file|
        file.write(toggle_data)
      end
      File.rename(backup_file_tmp, backup_file)
    rescue StandardError => e
      # This is not really the end of the world. Swallowing the exception.
      Unleash.logger.error "Unable to save backup file. Exception thrown #{e.class}:'#{e}'"
      Unleash.logger.error "stacktrace: #{e.backtrace}"
    end
  end
end
4 changes: 3 additions & 1 deletion lib/unleash/streaming_event_processor.rb
@@ -1,4 +1,5 @@
 require 'json'
+require 'unleash/backup_file_writer'

 module Unleash
   class StreamingEventProcessor
@@ -41,7 +42,8 @@ def handle_connected_event(event)
     def handle_updated_event(event)
       handle_delta_event(event.data)

-      # TODO: update backup file
+      full_state = @toggle_engine.get_state
+      Unleash::BackupFileWriter.save!(full_state)
     rescue JSON::ParserError => e
       Unleash.logger.error "Unable to parse JSON from streaming event data. Exception thrown #{e.class}: '#{e}'"
       Unleash.logger.debug "stacktrace: #{e.backtrace}"
21 changes: 2 additions & 19 deletions lib/unleash/toggle_fetcher.rb
@@ -1,5 +1,6 @@
 require 'unleash/configuration'
 require 'unleash/bootstrap/handler'
+require 'unleash/backup_file_writer'
 require 'net/http'
 require 'json'
 require 'yggdrasil_engine'
@@ -54,25 +55,7 @@ def fetch
       # always synchronize with the local cache when fetching:
       update_engine_state!(response.body)

-      save! response.body
-    end
-
-    def save!(toggle_data)
-      Unleash.logger.debug "Will save toggles to disk now"
-
-      backup_file = Unleash.configuration.backup_file
-      backup_file_tmp = "#{backup_file}.tmp"
-
-      self.toggle_lock.synchronize do
-        File.open(backup_file_tmp, "w") do |file|
-          file.write(toggle_data)
-        end
-        File.rename(backup_file_tmp, backup_file)
-      end
-    rescue StandardError => e
-      # This is not really the end of the world. Swallowing the exception.
-      Unleash.logger.error "Unable to save backup file. Exception thrown #{e.class}:'#{e}'"
-      Unleash.logger.error "stacktrace: #{e.backtrace}"
+      Unleash::BackupFileWriter.save! response.body
     end

     private
81 changes: 81 additions & 0 deletions spec/unleash/streaming_event_processor_spec.rb
@@ -0,0 +1,81 @@
RSpec.describe Unleash::StreamingEventProcessor do
  let(:engine) { YggdrasilEngine.new }
  let(:processor) { Unleash::StreamingEventProcessor.new(engine) }
  let(:backup_file) { Unleash.configuration.backup_file }

  before do
    Unleash.configure do |config|
      config.url = 'http://test-url/'
      config.app_name = 'test-app'
    end
    Unleash.logger = Unleash.configuration.logger
  end

  after do
    File.delete(backup_file) if File.exist?(backup_file)
  end

  class TestEvent
    attr_reader :type, :data

    def initialize(type, data)
      @type = type
      @data = data
    end
  end

  def feature_event(name, enabled = true)
    {
      "events": [{
        "type": "feature-updated",
        "eventId": 1,
        "feature": {
          "name": name,
          "enabled": enabled,
          "strategies": [{ "name": "default" }]
        }
      }]
    }.to_json
  end

  def backup_contains_feature?(name)
    return false unless File.exist?(backup_file)

    parsed = JSON.parse(File.read(backup_file))
    feature_names = parsed['features'].map { |f| f['name'] }
    feature_names.include?(name)
  end

  describe '#process_event' do
    it 'processes valid events and saves full engine state' do
      event = TestEvent.new('unleash-updated', feature_event('test-feature'))
      processor.process_event(event)

      expect(engine.enabled?('test-feature', {})).to eq(true)
      expect(backup_contains_feature?('test-feature')).to eq(true)
    end

    it 'ignores unknown event types' do
      event = TestEvent.new('unknown-event', feature_event('test-feature'))
      processor.process_event(event)

      expect(File.exist?(backup_file)).to eq(false)
      expect(engine.enabled?('test-feature', {})).to be_falsy
    end

    it 'saves full engine state, not partial event data' do
      processor.process_event(TestEvent.new('unleash-updated', feature_event('first-feature', true)))
      processor.process_event(TestEvent.new('unleash-updated', feature_event('second-feature', false)))

      expect(backup_contains_feature?('first-feature')).to eq(true)
      expect(backup_contains_feature?('second-feature')).to eq(true)
    end

    it 'handles invalid JSON gracefully without creating backup' do
      event = TestEvent.new('unleash-updated', 'invalid json')

      expect { processor.process_event(event) }.not_to raise_error
      expect(File.exist?(backup_file)).to eq(false)
    end
  end
end