#datalevin
2022-01-01
dgr 00:01:55

qq about Datalevin’s and Datahike’s memory needs. I’ve got a simple side project that I’m looking at. It will probably run on a VPS with minimal memory (2 GB, perhaps 4 GB). How are both Datalevin and Datahike under memory pressure? I don’t have a lot of data (a few hundred entities, each with maybe 50 attributes), but I want to make sure the DB doesn’t suck up all the memory in the system. Anyway, not sure if anybody has done any tests, but general rules of thumb would be interesting to help make the decision. I’m going to cross-post this to #datahike.
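For scale, the workload described above is roughly the following shape. This is a minimal sketch only: the directory path, schema, and attribute names are made up for illustration, and it assumes Datalevin’s Datascript-style `datalevin.core` API (`get-conn` / `transact!`).

```clojure
;; Minimal sketch of the workload described above: a few hundred
;; entities, each with ~50 attributes. Path, schema, and attribute
;; names are illustrative, not from this thread.
(require '[datalevin.core :as d])

(def schema
  {:item/id {:db/valueType :db.type/string
             :db/unique    :db.unique/identity}})

(def conn (d/get-conn "/tmp/datalevin/sideproject" schema))

;; ~300 entities x ~50 attributes each
(d/transact! conn
  (for [i (range 300)]
    (into {:item/id (str "item-" i)}
          (for [k (range 50)]
            [(keyword "item" (str "attr-" k)) (rand-int 1000)]))))
```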

denik 19:01:26

Anecdotal, but I’ve run Datalevin on 2 GB and 4 GB of RAM, including a Clojure web server and other processes, without problems.
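One knob worth noting when everything shares a small box: LMDB keeps its data in a memory-mapped file outside the JVM heap, so the heap can be capped fairly low and the rest of the RAM left to the OS page cache. A hedged sketch of what that might look like in deps.edn; the alias name and heap sizes are arbitrary examples, not values tested in this thread.

```clojure
;; deps.edn (fragment) — hypothetical :prod alias; the -Xmx/-Xms values
;; are examples, not tested recommendations. LMDB's mmap'd data is
;; off-heap, so the heap mainly needs to cover the app itself.
{:aliases
 {:prod {:jvm-opts ["-Xmx512m" "-Xms512m"]}}}
```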

dgr 18:01:53

Thanks @U050CJFRU, that’s what I was looking for.

👍 1
Huahai 05:01:56

Datalevin does not use much memory. Its storage layer is LMDB, a native DB whose code is only a few KB and can reside in L1 cache. The layers above storage are mostly Datascript code, which is designed to run in the browser. Datalevin-specific code mostly does binary encoding and the like, so I don’t think memory should be a concern for you.
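To make the layering concrete, the API you program against is essentially Datascript’s Datalog, with LMDB handling persistence underneath. A hedged sketch of a query against the `conn` from the earlier sketch; the query and attribute names are illustrative.

```clojure
;; Datascript-style Datalog query; Datalevin resolves the matching
;; datoms from LMDB's memory-mapped file rather than from an
;; in-memory index.
(d/q '[:find ?id
       :where
       [?e :item/attr-0 ?v]
       [(> ?v 900)]
       [?e :item/id ?id]]
     (d/db conn))
```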

Huahai 05:01:07

So far I have only benchmarked running speed, where Datalevin performs very well and is generally faster than the alternatives. I have not tested memory usage, for the reasons listed above. I would welcome contributions in this regard.
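For anyone curious what such a contribution might start from, here is one rough way to eyeball JVM heap use around a transaction. This is only a sketch: it sees the JVM heap alone, and since LMDB’s memory-mapped pages are off-heap, total process RSS would have to be checked at the OS level (e.g. with `ps`).

```clojure
;; Rough heap-usage probe around a Datalevin transaction. Only the JVM
;; heap is visible here; LMDB's mmap'd pages are accounted by the OS.
(defn used-heap-mb []
  (let [rt (Runtime/getRuntime)]
    (quot (- (.totalMemory rt) (.freeMemory rt)) (* 1024 1024))))

(System/gc)
(let [before (used-heap-mb)]
  (d/transact! conn [{:item/id "item-0" :item/attr-0 42}])
  (System/gc)
  (println "heap delta (MB):" (- (used-heap-mb) before)))
```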

dgr 18:01:11

So there's no additional caching layer, just LMDB, and LMDB simply memory-maps the data (using the OS file system cache), IIRC. Do I have that right?