Commit

prepare 5.7.0 release (launchdarkly#134)
eli-darkly authored Jan 12, 2019
1 parent 1273ba9 commit d7a95a4
Showing 21 changed files with 1,926 additions and 911 deletions.
16 changes: 16 additions & 0 deletions .babelrc
@@ -0,0 +1,16 @@
{
"env": {
"test": {
"presets": [
[
"env",
{
"targets": {
"node": "6"
}
}
]
]
}
}
}
6 changes: 5 additions & 1 deletion README.md
@@ -55,6 +55,10 @@ Your first feature flag
});
});

Using flag data from a file
---------------------------

For testing purposes, the SDK can be made to read feature flag state from a file or files instead of connecting to LaunchDarkly. See `FileDataSource` in the [TypeScript API documentation](https://github.com/launchdarkly/node-client/blob/master/index.d.ts) for more details.
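
For example, a minimal sketch (not part of this commit) might look like the following, assuming the package is installed as `ldclient-node`, that a local `./flags.json` file exists, and that the data source is passed in via the `updateProcessor` configuration property:

    var LaunchDarkly = require('ldclient-node');

    // Read flag data from a local file instead of connecting to LaunchDarkly.
    // autoUpdate reloads the data whenever the file changes on disk.
    var dataSource = LaunchDarkly.FileDataSource({
      paths: ['./flags.json'],
      autoUpdate: true
    });

    var client = LaunchDarkly.init('YOUR_SDK_KEY', {
      updateProcessor: dataSource
    });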

Learn more
-----------
@@ -84,9 +88,9 @@ About LaunchDarkly
* [JavaScript](http://docs.launchdarkly.com/docs/js-sdk-reference "LaunchDarkly JavaScript SDK")
* [PHP](http://docs.launchdarkly.com/docs/php-sdk-reference "LaunchDarkly PHP SDK")
* [Python](http://docs.launchdarkly.com/docs/python-sdk-reference "LaunchDarkly Python SDK")
* [Python Twisted](http://docs.launchdarkly.com/docs/python-twisted-sdk-reference "LaunchDarkly Python Twisted SDK")
* [Go](http://docs.launchdarkly.com/docs/go-sdk-reference "LaunchDarkly Go SDK")
* [Node.JS](http://docs.launchdarkly.com/docs/node-sdk-reference "LaunchDarkly Node SDK")
* [Electron](http://docs.launchdarkly.com/docs/electron-sdk-reference "LaunchDarkly Electron SDK")
* [.NET](http://docs.launchdarkly.com/docs/dotnet-sdk-reference "LaunchDarkly .Net SDK")
* [Ruby](http://docs.launchdarkly.com/docs/ruby-sdk-reference "LaunchDarkly Ruby SDK")
* [iOS](http://docs.launchdarkly.com/docs/ios-sdk-reference "LaunchDarkly iOS SDK")
119 changes: 104 additions & 15 deletions caching_store_wrapper.js
@@ -15,9 +15,50 @@ var initializedKey = "$checkedInit";
/*
CachingStoreWrapper provides commonly needed functionality for implementations of an
SDK feature store. The underlyingStore must implement a simplified interface for
querying and updating the data store (see redis_feature_store.js for an example)
while CachingStoreWrapper adds optional caching of stored items and of the
initialized state, and ensures that asynchronous operations are serialized correctly.
querying and updating the data store, while CachingStoreWrapper adds optional caching of
stored items and of the initialized state, and ensures that asynchronous operations are
serialized correctly.
The underlyingStore object must have the following methods:
- getInternal(kind, key, callback): Queries a single item from the data store. The kind
parameter is an object with a "namespace" property that uniquely identifies the
category of data (features, segments), and the key is the unique key within that
category. It calls the callback with the resulting item as a parameter, or, if no such
item exists, null/undefined. It should not attempt to filter out any items, nor to
cache any items.
- getAllInternal(kind, callback): Queries all items in a given category from the data
store, calling the callback with an object where each key is the item's key and each
value is the item. It should not attempt to filter out any items, nor to cache any items.
- upsertInternal(kind, newItem, callback): Adds or updates a single item. If an item with
the same key already exists (in the category specified by "kind"), it should update it
only if the new item's "version" property is greater than the old one. On completion, it
should call the callback with the final state of the item, i.e. if the update succeeded
then it passes the item that was passed in, and if the update failed due to the version
check then it passes the item that is currently in the data store (this ensures that
caching works correctly). Note that deletions are implemented by upserting a placeholder
item with the property "deleted: true".
- initializedInternal(callback): Tests whether the data store contains a complete data
set, meaning that initInternal() or initOrderedInternal() has been called at least once.
In a shared data store, it should be able to detect this even if the store was
initialized by a different process, i.e. the test should be based on looking at what is
in the data store. The method does not need to worry about caching this value;
CachingStoreWrapper will only call it when necessary. Call callback with true or false.
- initInternal(allData, callback): Replaces the entire contents of the data store. This
should be done atomically (i.e. within a transaction); if that isn't possible, use
initOrderedInternal() instead. The allData parameter is an object where each key is one
of the "kind" objects, and each value is an object with the keys and values of all
items of that kind. Call callback with no parameters when done.
OR:
- initOrderedInternal(collections, callback): Replaces the entire contents of the data
store. The collections parameter is an array of objects, each of which has "kind" and
"items" properties; "items" is an array of data items. Each array should be processed
in the specified order. The store should delete any obsolete items only after writing
all of the items provided.
*/
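
// Illustrative sketch only - not part of this module or this commit. A minimal in-memory
// "underlyingStore" matching the contract described above could look roughly like this;
// the name sketchCoreStore is hypothetical, and real implementations (e.g.
// redis_feature_store.js) remain the reference for exact callback conventions.
function sketchCoreStore() {
  var allData = {};      // allData[kind.namespace][key] = item
  var wasInited = false;
  return {
    getInternal: function(kind, key, cb) {
      cb((allData[kind.namespace] || {})[key] || null);
    },
    getAllInternal: function(kind, cb) {
      cb(allData[kind.namespace] || {});
    },
    upsertInternal: function(kind, newItem, cb) {
      var items = allData[kind.namespace] = (allData[kind.namespace] || {});
      var old = items[newItem.key];
      if (old && old.version >= newItem.version) {
        cb(old);                 // version check failed; report what is actually stored
      } else {
        items[newItem.key] = newItem;
        cb(newItem);
      }
    },
    initInternal: function(newData, cb) {
      allData = newData;
      wasInited = true;
      cb();
    },
    initializedInternal: function(cb) {
      cb(wasInited);
    }
  };
}
// A caller would then wrap it as: new CachingStoreWrapper(sketchCoreStore(), 30 /* cache TTL in seconds */)
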
function CachingStoreWrapper(underlyingStore, ttl) {
var cache = ttl ? new NodeCache({ stdTTL: ttl }) : null;
@@ -28,28 +69,36 @@ function CachingStoreWrapper(underlyingStore, ttl) {

this.init = function(allData, cb) {
queue.enqueue(function(cb) {
underlyingStore.initInternal(allData, function() {
// The underlying store can either implement initInternal, which receives unordered data,
// or initOrderedInternal, which receives ordered data (for implementations that cannot do
// an atomic update and therefore need to be told what order to do the operations in).
var afterInit = function() {
initialized = true;

if (cache) {
cache.del(initializedKey);
cache.flushAll();

// populate cache with initial data
for (var kindNamespace in allData) {
if (Object.hasOwnProperty.call(allData, kindNamespace)) {
var kind = dataKind[kindNamespace];
var items = allData[kindNamespace];
cache.set(allCacheKey(kind), items);
for (var key in items) {
cache.set(cacheKey(kind, key), items[key]);
}
}
}
Object.keys(allData).forEach(function(kindNamespace) {
var kind = dataKind[kindNamespace];
var items = allData[kindNamespace];
cache.set(allCacheKey(kind), items);
Object.keys(items).forEach(function(key) {
cache.set(cacheKey(kind, key), items[key]);
});
});
}

cb();
});
};

if (underlyingStore.initOrderedInternal) {
var orderedData = sortAllCollections(allData);
underlyingStore.initOrderedInternal(orderedData, afterInit);
} else {
underlyingStore.initInternal(allData, afterInit);
}
}, [], cb);
};

@@ -141,6 +190,46 @@ function CachingStoreWrapper(underlyingStore, ttl) {
cache.del(allCacheKey(dataKind[kindNamespace]));
}
}

// This and the next function are used by init() to provide the best ordering of items
// to write the underlying store, if the store supports the initOrderedInternal method.
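// Concretely: kinds are written in ascending "priority" order, and within each kind an item's
// prerequisites (as reported by kind.getDependencyKeys) are written before the item itself,
// via a simple depth-first dependency ordering.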
function sortAllCollections(dataMap) {
var result = [];
Object.keys(dataMap).forEach(function(kindNamespace) {
var kind = dataKind[kindNamespace];
result.push({ kind: kind, items: sortCollection(kind, dataMap[kindNamespace]) });
});
var kindPriority = function(kind) {
return kind.priority === undefined ? kind.namespace.length : kind.priority
};
result.sort(function(i1, i2) {
return kindPriority(i1.kind) - kindPriority(i2.kind);
});
return result;
}

function sortCollection(kind, itemsMap) {
var itemsOut = [];
var remainingItems = new Set(Object.keys(itemsMap));
var addWithDependenciesFirst = function(key) {
if (remainingItems.has(key)) {
remainingItems.delete(key);
var item = itemsMap[key];
if (kind.getDependencyKeys) {
kind.getDependencyKeys(item).forEach(function(prereqKey) {
addWithDependenciesFirst(prereqKey);
});
}
itemsOut.push(item);
}
};
while (remainingItems.size > 0) {
// pick an arbitrary item that hasn't been processed yet
var key = remainingItems.values().next().value;
addWithDependenciesFirst(key);
}
return itemsOut;
}
}

module.exports = CachingStoreWrapper;
7 changes: 3 additions & 4 deletions evaluate_flag.js
@@ -344,11 +344,10 @@ function bucketUser(user, key, attr, salt) {
idHash += "." + user.secondary;
}

hashKey = util.format("%s.%s.%s", key, salt, idHash);
hashVal = parseInt(sha1(hashKey).substring(0,15), 16);
var hashKey = util.format("%s.%s.%s", key, salt, idHash);
var hashVal = parseInt(sha1(hashKey).substring(0,15), 16);
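// substring(0,15) keeps 15 hex digits (a 60-bit value); dividing by 0xFFFFFFFFFFFFFFF
// (fifteen F's, i.e. 2^60 - 1) below maps the hash onto the range [0, 1] used for bucketing.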

result = hashVal / 0xFFFFFFFFFFFFFFF;
return result;
return hashVal / 0xFFFFFFFFFFFFFFF;
}

function bucketableStringValue(value) {
2 changes: 1 addition & 1 deletion event_processor.js
@@ -44,7 +44,7 @@ function EventProcessor(sdkKey, config, errorReporter) {
function makeOutputEvent(event) {
switch (event.kind) {
case 'feature':
debug = !!event.debug;
var debug = !!event.debug;
var out = {
kind: debug ? 'debug' : 'feature',
creationDate: event.creationDate,
15 changes: 12 additions & 3 deletions feature_store.js
@@ -1,8 +1,17 @@
var dataKind = require('./versioned_data_kind');

// An in-memory store with an async interface.
// It's async as other implementations (e.g. the RedisFeatureStore)
// may be async, and we want to retain interface compatibility.
// The default in-memory implementation of a feature store, which holds feature flags and
// other related data received from LaunchDarkly.
//
// Other implementations of the same interface can be used by passing them in the featureStore
// property of the client configuration (that's why the interface here is async, even though
// the in-memory store doesn't do anything asynchronous - because other implementations may
// need to be async). The interface is defined by LDFeatureStore in index.d.ts. There is a
// Redis-backed implementation in RedisFeatureStore; for other options, see
// [https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store].
//
// Additional implementations should use CachingStoreWrapper if possible.
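//
// For illustration only (not from this commit), and assuming RedisFeatureStore is exported by
// this package, plugging in a different store looks roughly like:
//   var store = RedisFeatureStore();
//   var client = LaunchDarkly.init(sdkKey, { featureStore: store });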

var noop = function(){};
function InMemoryFeatureStore() {
var store = {allData:{}};
147 changes: 147 additions & 0 deletions file_data_source.js
@@ -0,0 +1,147 @@
var fs = require('fs'),
winston = require('winston'),
yaml = require('yaml'),
dataKind = require('./versioned_data_kind');

/*
FileDataSource provides a way to use local files as a source of feature flag state, instead of
connecting to LaunchDarkly. This would typically be used in a test environment.
See documentation in index.d.ts.
*/
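// For illustration only (not from this commit): given the keys read below ("flags",
// "flagValues", "segments"), the simplest possible flag file could contain just
//   { "flagValues": { "my-boolean-flag": true, "my-string-flag": "hello" } }
// (the flag keys here are hypothetical); full flag and segment definitions go under
// "flags" and "segments" respectively.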
function FileDataSource(options) {
var paths = (options && options.paths) || [];
var autoUpdate = !!(options && options.autoUpdate);

return config => {
var featureStore = config.featureStore;
var watchers = [];
var pendingUpdate = false;
var logger = options.logger || config.logger || defaultLogger();
var inited = false;

function defaultLogger() {
return new winston.Logger({
level: 'info',
transports: [ new (winston.transports.Console)() ]
});
}

function loadFilePromise(path, allData) {
return new Promise((resolve, reject) =>
fs.readFile(path, 'utf8', (err, data) =>
err ? reject(err) : resolve(data))
).then(data => {
var parsed = parseData(data) || {};
var addItem = (kind, item) => {
if (!allData[kind.namespace]) {
allData[kind.namespace] = {};
}
if (allData[kind.namespace][item.key]) {
throw new Error('found duplicate key: "' + item.key + '"');
} else {
allData[kind.namespace][item.key] = item;
}
}
Object.keys(parsed.flags || {}).forEach(key => {
addItem(dataKind.features, parsed.flags[key]);
});
Object.keys(parsed.flagValues || {}).forEach(key => {
addItem(dataKind.features, makeFlagWithValue(key, parsed.flagValues[key]));
});
Object.keys(parsed.segments || {}).forEach(key => {
addItem(dataKind.segments, parsed.segments[key]);
});
});
}

function loadAllPromise() {
pendingUpdate = false;
var allData = {};
var p = Promise.resolve();
for (var i = 0; i < paths.length; i++) {
(path => {
p = p.then(() => loadFilePromise(path, allData))
.catch(e => {
throw new Error('Unable to load flags: ' + e + ' [' + path + ']');
});
})(paths[i]);
}
return p.then(() => initStorePromise(allData));
}

function initStorePromise(data) {
return new Promise(resolve => featureStore.init(data, () => {
inited = true;
resolve();
}));
}

function parseData(data) {
// Every valid JSON document is also a valid YAML document (for parsers that comply
// with the spec, which this one does) so we can parse both with the same parser.
return yaml.parse(data);
}

function makeFlagWithValue(key, value) {
return {
key: key,
on: true,
fallthrough: { variation: 0 },
variations: [ value ]
};
}

function startWatching() {
var reload = () => {
loadAllPromise().then(() => {
logger && logger.warn('Reloaded flags from file data');
}).catch(() => {});
};
paths.forEach(path => {
var watcher = fs.watch(path, { persistent: false }, (event, filename) => {
if (!pendingUpdate) { // coalesce updates to avoid reloading repeatedly
pendingUpdate = true;
setTimeout(reload, 0);
}
});
watchers.push(watcher);
});
}

function stopWatching() {
watchers.forEach(w => w.close());
watchers = [];
}

var fds = {};

fds.start = fn => {
var cb = fn || (() => {});

if (autoUpdate) {
startWatching();
}

loadAllPromise().then(() => cb(), err => cb(err));
};

fds.stop = () => {
if (autoUpdate) {
stopWatching();
}
};

fds.initialized = () => {
return inited;
};

fds.close = () => {
fds.stop();
};

return fds;
}
}

module.exports = FileDataSource;