This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-03-20
For example, the number of random version 4 UUIDs which need to be generated in order to have a 50% probability of at least one collision is 2.71 quintillion. This number is equivalent to generating 1 billion UUIDs per second for about 85 years, and a file containing this many UUIDs, at 16 bytes per UUID, would be about 45 exabytes, many times larger than the largest databases currently in existence, which are on the order of hundreds of petabytes.
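The quoted figures follow from the standard birthday-problem approximation, and can be checked with a short sketch (this assumes the usual UUIDv4 layout of 122 random bits, with 6 bits fixed by the version/variant fields):

```python
import math

# A version-4 UUID has 122 random bits (6 of its 128 bits are fixed).
N = 2 ** 122

# Birthday-problem approximation: the number of samples needed for a
# 50% chance of at least one collision is roughly sqrt(2 * N * ln 2).
n_50 = math.sqrt(2 * N * math.log(2))

print(f"{n_50:.3g} UUIDs for a 50% collision chance")  # ~2.71e18, i.e. 2.71 quintillion
print(f"{n_50 / 1e9 / (365.25 * 24 * 3600):.0f} years at 1 billion UUIDs/sec")  # ~86
print(f"{n_50 * 16 / 1e18:.0f} exabytes at 16 bytes per UUID")  # ~43
```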
you can't guarantee the uniqueness without some kind of centralised co-ordination
50% chance of at least one collision in 2.71 quintillion samples
a useful metric would be, what is the largest N such that even if you generated N uuids, the probability of having a collision is < chance of dying from an asteroid impact
Putting a probability number on the chances of being hit by a space rock is difficult, since the events are so rare. Still, Tulane University earth sciences professor Stephen A. Nelson published a paper in 2014 that made the effort. He put the lifetime odds of dying from a local meteorite, asteroid, or comet impact at 1 in 1,600,000.
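Taking the quoted 1-in-1,600,000 lifetime odds at face value, the small-probability form of the birthday approximation, P(collision) ≈ n²/(2N), can be inverted to estimate that largest N. This is only a back-of-the-envelope sketch, again assuming 122 random bits per UUIDv4:

```python
import math

N = 2 ** 122           # possible values of the random bits in a version-4 UUID
p = 1 / 1_600_000      # quoted lifetime odds of dying from an impact event

# Invert P(collision) ≈ n^2 / (2N) to get n ≈ sqrt(2 * N * p).
n_max = math.sqrt(2 * N * p)
print(f"~{n_max:.2g} UUIDs before collision odds exceed the impact odds")  # ~2.6e15
```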
the equation is in the wiki article
actually, i can't find the article now
but a more useful metric would be considering the chance that a centralised UUID scheme results in collisions thanks to human/technological error
"dying from asteroid attack" is not a seriously useful anything
i think you have to propose what alternative exists that doesn't have a non-zero probability of issuing a duplicate
@qqq or, explain what you're doing that has a good chance of producing entity ids every microsecond consistently for ~86,000 years straight in the same datascript db
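The timescale at that rate follows from the 2.71 quintillion figure quoted earlier:

```python
# One entity id per microsecond is 1e6 ids per second; time to reach
# the 50%-collision threshold of ~2.71 quintillion ids:
ids_per_second = 1_000_000
threshold = 2.71e18
years = threshold / ids_per_second / (365.25 * 24 * 3600)
print(f"~{years:.0f} years")  # roughly 86,000 years
```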
with a solid use case, it's more likely that other people can:
- agree there is a problem to be solved
- solve the problem without causing new problems
even twitter "only" receives 6000 tweets per second, based on a quick google
and datascript on a single machine cannot physically transact at the rate you're talking, as highlighted by the benchmarks from the 11th