#cljs-dev
2016-04-04
dnolen01:04:00

@nberger: that looks good, will take a look this week and if that’s a fix, will ship a release on Friday.

nberger01:04:58

Awesome @dnolen, thanks!

bronsa08:04:41

@dnolen: tools.reader-1.0.0-beta1 is out, fixing CLJ-1606, if you want to include that in the next release as well
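
For anyone wanting to try that reader fix ahead of a ClojureScript release, the dependency coordinates are the usual tools.reader ones; whether an override like this gets picked up depends on the build setup:

;; Leiningen-style dependency vector for the release mentioned above
[org.clojure/tools.reader "1.0.0-beta1"]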

dnolen12:04:16

@bronsa great, thanks

darwin13:04:47

@nberger: thanks, I will look into that, but first I have to solve another problem: even without :parallel-build enabled, 1.7.228 works for me, but upgrading to 1.8.40 causes my generated js files to have some weird UTF-8 issue; Chrome complains that the files are not properly UTF-8 encoded
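
For context, the kind of build configuration being compared here looks roughly like the following; the output path and exact values are illustrative, not darwin’s actual settings:

;; illustrative ClojureScript compiler options
{:output-to      "out/main.js"   ; hypothetical path
 :optimizations  :whitespace     ; as described above
 :parallel-build false}          ; the issue shows up even with this off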

darwin13:04:07

I’m working on a chrome extension[1], and the internal script loader for chrome extension js files has some additional checks there; the file looks good at first glance. [1] https://github.com/binaryage/dirac

dnolen13:04:17

@darwin a git bisect would probably be useful here - I know we took some patches around UTF-8 and source maps

darwin13:04:04

@dnolen: iconv and some other command-line tools see no problem with the .js file (it is 5.4 MB), compiled with :optimizations :whitespace

darwin13:04:21

I’m experimenting with other settings, thanks for the source map hint, the problem might be there

darwin17:04:37

so I have isolated my UTF-8 troubles to this commit: https://github.com/clojure/clojurescript/commit/404d6444cb6419658a7dacb343a5fed5b9451e0c so it seems the problem may be in the Closure Compiler itself

darwin17:04:50

I have inspected the chrome utf-8 validation code and they use this routine to validate each codepoint read from the file: https://chromium.googlesource.com/chromium/src.git/+/9f6c8e18f1233b4ca4cb84b38a1df2e5c1462dcf/base/strings/utf_string_conversion_utils.h#29

darwin17:04:25

maybe they are too strict and the closure compiler has a looser notion of what is valid UTF-8; I’m no expert in this encoding stuff

darwin17:04:40

I could probably stitch together a small cmd-line util to run that code against my failing file and see exactly which character is causing the failure
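
A Clojure sketch of such a utility might look like the following; it approximates the Chromium check (rejecting surrogates, noncharacters, and anything above U+10FFFF) rather than running the linked C++ code, and the file path is hypothetical:

(defn chrome-valid-char?
  "Approximation of Chromium's codepoint check: rejects surrogates,
  noncharacters (U+FDD0..U+FDEF and U+xFFFE/U+xFFFF in every plane),
  and codepoints above U+10FFFF."
  [cp]
  (and (<= cp 0x10FFFF)
       (not (<= 0xD800 cp 0xDFFF))
       (not (<= 0xFDD0 cp 0xFDEF))
       (not= 0xFFFE (bit-and cp 0xFFFE))))

(defn suspicious-codepoints
  "Returns [index codepoint] pairs for codepoints in a UTF-8 file that
  fail the check above (index counts codepoints, not bytes)."
  [path]
  (let [^String s (slurp path :encoding "UTF-8")]
    (->> (.toArray (.codePoints s))
         (map-indexed vector)
         (remove (fn [[_ cp]] (chrome-valid-char? cp))))))

;; (suspicious-codepoints "out/main.js")  ; hypothetical path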

darwin17:04:33

the comment says: “This pattern is not theoretically correct according to the Unicode standard”; the problem is that in UTF-8 compilation mode, the closure compiler turns those codepoints blindly into raw UTF-8

darwin17:04:47

and that causes Chrome’s UTF-8 validation to choke

darwin17:04:53

the generated js file contains: goog.i18n.bidi.ltrChars_ = "A-Za-zÀ-ÖØ-öø-ʸ̀-֐ࠀ-῿" + "‎Ⰰ-﬜︀-﹯﻽-”; (which probably didn’t survive my copy&pasting anyways)

dnolen18:04:38

@darwin so it sounds like the new default is a problem and we should revert that default

darwin18:04:43

I don’t think so, my case is rare; I have just posted my findings here for the record: http://dev.clojure.org/jira/browse/CLJS-1547?focusedCommentId=42617

darwin18:04:30

if I compiled the thing under :advanced optimizations, the bidi namespace would get DCE-ed and I wouldn’t run into the issue, I guess

darwin18:04:59

maybe we could raise it upstream somehow

dnolen18:04:02

thanks for digging in

dnolen18:04:32

yes that’s possible re: raising it upstream

darwin18:04:50

no problem, it is my bad that I’m pushing cljs into those unexplored areas like chrome extensions :simple_smile:

darwin18:04:26

btw, did you have a chance to look at my proposal for inlining truth? I posted it a few days back in this channel

darwin18:04:03

would it be worth creating a jira ticket for it?

darwin18:04:02

I think this could potentially make cljs hints obsolete, leaving this (hard) work to the closure compiler
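
For readers unfamiliar with the hints being referred to, the mechanism looks roughly like this; the emitted JS in the comments is approximate, not literal compiler output:

;; Without a hint, a ClojureScript truth test goes through cljs.core/truth_:
(defn pick [x]
  (if x :yes :no))
;; emits roughly: cljs.core.truth_(x) ? ... : ...

;; With a ^boolean hint the compiler emits a plain JS test instead:
(defn pick-hinted [^boolean x]
  (if x :yes :no))
;; emits roughly: x ? ... : ...

;; The proposal is about letting the Closure Compiler take over this kind of
;; simplification so that manual hints are no longer needed.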

dnolen18:04:07

@darwin no I haven’t had time to look at that - feel free to make a ticket with your proposal so I don’t lose track of it

darwin18:04:35

sure, will do

darwin20:04:14

posted here: http://dev.clojure.org/jira/browse/CLJS-1615, I’m open to doing more investigation and polishing on it if it proves to be a promising way forward