#instaparse
2016-01-13
lucasbradstreet01:01:19

Hi @jamesnvc, cljs perf is definitely a bit slow. You need to be using advanced compilation, otherwise it’s incredibly slow.
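
A minimal sketch of what “advanced mode” means in practice, assuming lein-cljsbuild; the build id and output path are placeholders, not from the thread:

```clojure
;; project.clj fragment — build id and paths are hypothetical
:cljsbuild
{:builds
 [{:id "min"
   :source-paths ["src"]
   :compiler {:output-to     "resources/public/js/app.js"
              :optimizations :advanced   ; instaparse is far slower under :none/:whitespace
              :pretty-print  false}}]}
```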

lucasbradstreet01:01:24

I’m the guy who did the port

jamesnvc01:01:00

@lucasbradstreet: Cool, thanks :simple_smile: Good to know I’m (maybe) not just doing something crazy

lucasbradstreet01:01:14

Depending on what you’re doing, you can also serialise the parser definition and load it in directly

lucasbradstreet01:01:38

Creating the initial parser in cljs can take quite a while, but the parsing itself can be pretty acceptable
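
One way to keep that initial cost off the hot path is to build the parser a single time at namespace load, or (in later instaparse releases) at compile time via instaparse.core/defparser. A rough ClojureScript sketch with a made-up arithmetic grammar:

```clojure
(ns example.parse
  (:require [instaparse.core :as insta]))

;; Built once when the namespace loads; every call to parse-expr reuses it.
(def arith
  (insta/parser
    "expr   = term (('+' | '-') term)*
     term   = factor (('*' | '/') factor)*
     factor = number | '(' expr ')'
     number = #'[0-9]+'"))

;; Later instaparse versions add a compile-time variant, which moves the
;; expensive grammar analysis out of the browser entirely:
;; (insta/defparser arith "expr = ...")

(defn parse-expr [s]
  (insta/parse arith s))
```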

meow01:01:03

and my money was on crazy @jamesnvc

meow01:01:18

can any of that "job" be split between client and server?

meow01:01:28

say for a chat app

meow01:01:18

just send the user keystrokes to the server - do it in clj there

meow01:01:30

just brainstorming

meow01:01:45

doesn't each keystroke go to the server already? that's how you can display the fact that the user is typing

meow01:01:07

so don't do any processing on the client - do the instaparse on the server
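
If the parsing did move server-side as suggested, a bare-bones Ring handler in Clojure could look roughly like this; the grammar and endpoint shape are assumptions for illustration only:

```clojure
(ns example.server
  (:require [instaparse.core :as insta]))

;; Hypothetical grammar — substitute whatever the client would otherwise parse.
(def message-parser
  (insta/parser
    "msg  = word (<' '> word)*
     word = #'\\S+'"))

;; Plain Ring handler: parse the raw request body and return the tree as EDN.
(defn parse-handler [request]
  (let [text (slurp (:body request))]
    {:status  200
     :headers {"Content-Type" "application/edn"}
     :body    (pr-str (message-parser text))}))
```

Whether this beats client-side parsing then comes down to request latency versus client CPU, which is exactly the trade-off discussed below.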

meow01:01:34

and use yada or onyx or something to scale it

meow01:01:59

we can segregate services on the server and compose them

meow01:01:48

compose microservices on the server and keep the client relatively stupid whenever the data is already on the server

lucasbradstreet01:01:27

I was parsing excel formulas on the client and it was good enough

lucasbradstreet01:01:41

Certainly faster than a round-trip to the server

meow01:01:06

I'll defer to @jamesnvc since I'm just blowing smoke

lucasbradstreet01:01:25

My overall experience was that creating the initial parser was very expensive, but the parsing itself was OK, provided it was compiled in advanced mode. All with a big chunk of “your mileage may vary”. Unfortunately I don’t have any time to work on performance any further

jamesnvc11:01:00

Cool, I was thinking of splitting it between client and server, but I will give it a shot with advanced compilation too

lucasbradstreet11:01:15

Also works. I’d measure how long it takes to do the individual parses, not just page load time - because that will be affected by creating the initial parser
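
A quick sketch of that measurement, using the standard time macro so parser construction and individual parses are reported separately; the names and grammar are placeholders:

```clojure
(ns example.bench
  (:require [instaparse.core :as insta]))

(enable-console-print!)   ; so (time ...) output reaches the browser console

;; Construction cost, paid once — this is what dominates page load.
(def my-parser
  (time (insta/parser "s = #'[a-z]+'")))

;; Per-parse cost, measured on its own.
(defn timed-parse [input]
  (time (my-parser input)))
```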

meow13:01:41

@jamesnvc: we should test both and not make assumptions either way, imnsho

meow13:01:01

@lucasbradstreet: thanks for the help and suggestions - much appreciated

lucasbradstreet13:01:42

Agreed. Though you have to assume some variability in request latency when testing the other method, which is why I ultimately went with the client side approach. That said, you can have slow CPU clients too.

meow13:01:35

then we should simulate issues with both environments and various combinations/permutations

meow13:01:50

ask the #C0J20813K team how good I am at doing that

meow13:01:06

issues, oh yeah, I got issues