39 commits
84ac9a0
tests: wait before close because reasons
pgte May 23, 2017
5b18536
added missing dev dep
pgte May 23, 2017
293183a
saving the log
pgte May 25, 2017
94192c1
sync WIP
pgte May 25, 2017
0b93a7a
Merge branch 'master' into sync
pgte May 25, 2017
db68e49
Merge branch 'master' into sync
pgte May 26, 2017
a26fced
WIP
pgte May 27, 2017
1c97bc6
seams to be working now
pgte May 27, 2017
0039307
fixed some callback calling and async stuffs
pgte May 27, 2017
25d85dc
for existing parents
pgte May 27, 2017
f031f37
fixed cids of merged head
pgte May 27, 2017
bfab565
sync tests: extended waiting time
pgte May 27, 2017
1f0a973
test timeouts
pgte May 28, 2017
5bffdf1
package lock
pgte May 28, 2017
56d3e1a
sync reacts immediately
pgte May 28, 2017
e9ae2af
tests fixed
pgte May 28, 2017
97bf905
corrected conflict management
pgte May 29, 2017
021ac42
testing booting a third node
pgte May 29, 2017
717c3f7
removed sauce
pgte May 29, 2017
78313e1
trying to fix webrtc om ubuntu
pgte May 29, 2017
9a6c900
trying to fix webrtc om ubuntu
pgte May 29, 2017
56db6d6
voodoo black magic
pgte May 29, 2017
9834f85
voodoo black magic
pgte May 29, 2017
6cf309f
voodoo black magic
pgte May 29, 2017
e93516a
segfault handler for tests
pgte May 29, 2017
9ca1082
docs: options.heads -> options.log
pgte May 29, 2017
73e0f3d
browser tests
pgte May 30, 2017
89cf01e
travis CI
pgte May 30, 2017
43873a6
fixed arg check
pgte May 30, 2017
0520c81
removed console.log
pgte May 30, 2017
363f8a2
travis CI
pgte May 31, 2017
56aa429
travis CI
pgte May 31, 2017
9c2e9d8
travis CI
pgte May 31, 2017
68a1e33
leveldown conformance tests: related fixes
pgte Jun 7, 2017
8f249c5
sync tests need sync
pgte Jun 7, 2017
049a190
abstract leveldown tests finish
pgte Jun 8, 2017
b3fd313
block -> dag API
pgte Jun 8, 2017
3bfd63b
sync tests
pgte Jun 8, 2017
10c17dd
tests: removed unnecessary stuff
pgte Jun 8, 2017
12 changes: 12 additions & 0 deletions .aegir.js
@@ -0,0 +1,12 @@
'use strict'

module.exports = {
  karma: {
    files: [{
      pattern: 'test/fixtures/**/*.js',
      watched: false,
      served: true,
      included: false
    }]
  }
}
1 change: 1 addition & 0 deletions .gitignore
@@ -1 +1,2 @@
node_modules
crash.log
11 changes: 8 additions & 3 deletions .travis.yml
@@ -5,17 +5,22 @@ matrix:
include:
- node_js: 6
env:
- SAUCE=true
- SAUCE=false
- CXX=g++-4.8

# Make sure we have new NPM.
before_install:
- npm install -g npm
# before_install:
# - npm install -g npm

before_script:
- export DISPLAY=:99.0
- sh -e /etc/init.d/xvfb start

script:
- npm test

addons:
firefox: 'latest'
apt:
sources:
- ubuntu-toolchain-r-test
43 changes: 42 additions & 1 deletion README.md
@@ -26,9 +26,50 @@ Arguments:

* `options` (object, defaults to [this](src/default-options.js)): with the following keys:
* `ipfsOptions` (object). [IPFS options object](https://github.com/ipfs/js-ipfs#advanced-options-when-creating-an-ipfs-node).
* `heads` (LevelDown-compatible database that stores the heads)
* `log` (LevelDown-compatible database that stores the log)
* `sync` (boolean, defaults to `false`): EXPERIMENTAL! syncs data between nodes
* `ipfs` (IPFS object): an IPFS instance. If you can already provide an IPFS object, pass it in here.


# Default arguments

You can create a constructor that curries some default arguments by using `IPFSLevel.defaults(options)` like this:

```js
const ipfsLevel = IPFSLevel.defaults({
  log: someLevelDownLogDatabase
})
```

## Sync

TODO: Explain sync


## With Levelup


This default-options feature may be useful when you have to hand over a constructor and will have no say in the options it receives, as with the Levelup constructor:

```js
const LevelUp = require('levelup')
const Memdown = require('memdown') // any leveldown db will do for caching log entries
const IPFSLevel = require('ipfs-level').defaults({
  log: Memdown('some-partition-name') // log database should be scoped to partition
})

const db = LevelUp({ db: IPFSLevel })
// now you have a levelup db you can use
```

# Internals

The internals are documented [here](docs/INTERNALS.md).

# Test and debug

This package uses [debug](https://github.com/visionmedia/debug#readme), so you can activate debug messages by setting the environment variable `DEBUG` to `ipfs-level:*`.
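For example, in a POSIX shell (`npm test` below stands in for any command that loads ipfs-level):

```shell
# enable every ipfs-level debug namespace for this shell session
export DEBUG='ipfs-level:*'
# ...then run the tests (or any program that uses ipfs-level):
# npm test
```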

# License

MIT
55 changes: 55 additions & 0 deletions docs/INTERNALS.md
@@ -0,0 +1,55 @@
# ipfs-level Internals

# Local put

When a node does a `db.put(key, value, callback)`, this is what happens inside that node:

* The key and value are encoded into a single document. We'll call this the 'kv-doc'.
* This kv-doc is written to IPFS (using the IPFS DAG API).
* IPFS gives you a Content ID (CID) in return, which uniquely identifies this document.
* The node retrieves the latest log entry (the latest HEAD) from the local log.
* A new log entry is created. This entry contains:
  * `parent`: the CID of the latest HEAD
  * `key`: the kv-doc key
  * `cid`: the CID of the kv-doc
  * `clock`: the updated vector clock, where the entry for the current node was incremented
* This new log entry is written into IPFS, using the DAG API.
* IPFS gives you a CID for that log entry.
* The node uses that CID as a key to save the log entry into the log.
* The node also saves the log entry into the log using a key derived from the kv-doc key. This way it's easy to retrieve the latest log entry for a given value.
* The node saves the log entry under the key `HEAD`, to be able to retrieve it later.
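A minimal sketch of how such a log entry could be assembled (the helper names `incrementClock` and `newLogEntry` are illustrative, not the actual ipfs-level internals):

```javascript
// Illustrative sketch of the log entry assembled on a local put.
// These helper names are hypothetical, not the actual ipfs-level internals.

function incrementClock (clock, nodeId) {
  const next = Object.assign({}, clock)
  next[nodeId] = (next[nodeId] || 0) + 1
  return next
}

function newLogEntry (headEntry, headCid, key, kvDocCid, nodeId) {
  return {
    parent: headCid,   // CID of the latest HEAD (null on the first put)
    key: key,          // the kv-doc key
    cid: kvDocCid,     // CID of the kv-doc
    clock: incrementClock(headEntry ? headEntry.clock : {}, nodeId)
  }
}

// First put on a fresh node: there is no HEAD yet
const entry = newLogEntry(null, null, 'greeting', 'QmSomeKvDocCid', 'node-a')
// entry.clock is { 'node-a': 1 }
```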

# Remote update

When a node gets a remote log update, the message contains the latest log entry (the remote HEAD), which contains:
* `parent`: the remote parent log entry
* `key`: the key of the update this log entry pertains to
* `cid`: the CID of the remote kv-doc that originated this entry
* `clock`: the remote vector clock generated with this log entry

Upon receiving this message, the node goes through these procedures:

* Full log retrieval:
  * If the node already contains the parent log entry of the remote log entry, nothing is done
  * Otherwise, it retrieves the parent log entry and stores it in the log
  * Repeat these steps until all log entries are present locally
* Log sorting:
  * Collect all the new log entries
  * Sort them in temporal order
* Log entry processing: starting from the earliest log entry, for each entry:
  * retrieve the latest local log entry for that key
  * compare the remote and local vector clocks:
    * if the remote vector clock precedes the local one: do nothing
    * if the local vector clock precedes the remote one:
      * point the latest entry for that key to the new log entry
    * if the local vector clock is concurrent with the remote one:
      * here we have to deterministically pick one of the log entries as the winning one
      * pick the log entry with the highest CID
      * if the remote log entry wins, point the latest entry for that key to this new entry
* HEAD update:
  * compare the remote and the local head
  * if the local head precedes the remote one, point HEAD to the remote HEAD
  * if the remote head precedes the local one, do nothing
  * if both entries are concurrent, pick the one with the highest CID and set HEAD to that one
* HEAD broadcast:
  * whenever there is a change in the HEAD pointer, use IPFS pubsub to broadcast that change to all interested nodes
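The vector-clock comparison and deterministic CID tie-break described above can be sketched like this (a simplified illustration; the actual module relies on the `vectorclock` package, and `compareClocks`/`winner` are hypothetical names):

```javascript
// Illustrative vector-clock comparison and deterministic tie-break.

function compareClocks (a, b) {
  const keys = Object.keys(a).concat(Object.keys(b))
  let aBehind = false
  let bBehind = false
  for (const k of new Set(keys)) {
    const av = a[k] || 0
    const bv = b[k] || 0
    if (av < bv) aBehind = true
    if (av > bv) bBehind = true
  }
  if (aBehind && bBehind) return 'concurrent'
  if (aBehind) return 'precedes' // a happened before b
  if (bBehind) return 'follows'  // b happened before a
  return 'equal'
}

// Pick the winning log entry for a key, breaking concurrent ties by highest CID
function winner (localEntry, remoteEntry) {
  const order = compareClocks(localEntry.clock, remoteEntry.clock)
  if (order === 'precedes') return remoteEntry
  if (order === 'concurrent') {
    return localEntry.cid > remoteEntry.cid ? localEntry : remoteEntry
  }
  return localEntry
}
```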
22 changes: 19 additions & 3 deletions package.json
@@ -5,7 +5,10 @@
"main": "src/index.js",
"scripts": {
"lint": "standard",
"test": "npm run lint && mocha --timeout=10000"
"test": "npm run compile:tests && npm run test:browser",
"test:browser": "aegir-test browser --dom",
"test:node": "mocha test/leveldown.js --timeout=20000 && mocha test/iterator.js --timeout=20000 && mocha test/sync.js --timeout=20000",
"compile:tests": "browserify test/abstract-leveldown.js -o test/abstract-leveldown-generated.spec.js --debug"
},
"repository": {
"type": "git",
@@ -25,15 +28,28 @@
},
"homepage": "https://github.com/pgte/ipfs-level#readme",
"devDependencies": {
"aegir": "^11.0.2",
"browserify": "^14.4.0",
"chai": "^3.5.0",
"dirty-chai": "^1.2.2",
"memdown": "^1.2.4",
"mocha": "^3.4.1",
"standard": "^10.0.2"
"rimraf": "^2.6.1",
"standard": "^10.0.2",
"tape": "^4.6.3"
},
"dependencies": {
"abstract-leveldown": "^2.6.1",
"async": "^2.4.1",
"backoff": "^2.5.0",
"debug": "^2.6.8",
"deep-assign": "^2.0.0",
"ipfs": "^0.23.1"
"hyperdiff": "^2.0.2",
"ipfs": "^0.24.0",
"lodash.clonedeep": "^4.5.0",
"vectorclock": "0.0.0"
},
"browser": {
"./test/utils/create-repo-node.js": "./test/utils/create-repo-browser.js"
}
}
24 changes: 24 additions & 0 deletions src/decoding.js
@@ -0,0 +1,24 @@
'use strict'

function decode (str) {
  if (Buffer.isBuffer(str)) {
    str = str.toString()
  }
  if (typeof str === 'object') {
    return str
  }
  return JSON.parse(str)
}

module.exports = function decoding (callback) {
  if (typeof callback !== 'function') {
    throw new Error('callback is not a function')
  }
  return (err, str) => {
    if (err && err.message === 'NotFound') {
      callback(null, undefined)
    } else {
      callback(err, str && decode(str))
    }
  }
}
7 changes: 0 additions & 7 deletions src/encode.js
@@ -6,10 +6,3 @@ exports.kv = (key, value, options) => {
    value: value
  }
}

exports.deleted = (key) => {
  return {
    key: key,
    deleted: true
  }
}
10 changes: 8 additions & 2 deletions src/index.js
@@ -1,5 +1,11 @@
'use strict'

const IPFSLeveldown = require('./ipfs-leveldown')
const merge = require('deep-assign')

module.exports = (partition, options) => new IPFSLeveldown(partition, options)
const IPFSLevel = require('./ipfs-level')

exports = module.exports = (partition, options) => new IPFSLevel(partition, options)
exports.defaults = (defaultOptions) => (partition, options) => {
  const opts = merge({}, defaultOptions, options || {})
  return new IPFSLevel(partition, opts)
}