This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2016-04-04
Channels
- # beginners (31)
- # boot (24)
- # braid-chat (17)
- # cider (4)
- # cljs-dev (33)
- # cljsrn (5)
- # clojure (79)
- # clojure-austin (1)
- # clojure-poland (229)
- # clojure-russia (51)
- # clojure-uk (3)
- # clojurescript (68)
- # core-async (1)
- # core-matrix (1)
- # datomic (18)
- # editors (24)
- # emacs (7)
- # hoplon (118)
- # jobs (1)
- # jobs-discuss (1)
- # juxt (7)
- # off-topic (16)
- # om (121)
- # onyx (3)
- # parinfer (224)
- # protorepl (3)
- # re-frame (29)
- # reagent (1)
- # rethinkdb (2)
- # ring (1)
- # spacemacs (2)
- # untangled (182)
@nberger: that looks good, will take a look this week and if that’s a fix, will ship a release on Friday.
@dnolen: tools.reader-1.0.0-beta1 is out, fixing CLJ-1606, if you want to include that in the next release as well
@nberger: thanks, I will look into that, but first I have to solve another problem: even without :parallel-build enabled, 1.7.228 works for me, but upgrading to 1.8.40 causes my generated js files to have some weird UTF-8 issue, and Chrome complains that the files are not properly UTF-8 encoded
I’m working on a Chrome extension [1], and the internal script loader for Chrome extension js files has some additional checks there; the file looks good at first glance [1] https://github.com/binaryage/dirac
@darwin a git bisect would probably be useful here - I know we took some patches around UTF-8 and source maps
@dnolen: iconv and some other command-line tools see no problem with the .js file (it is 5.4 MB), compiled with :optimizations :whitespace
I’m experimenting with other settings, thanks for the source map hint, the problem might be there
so I have isolated my UTF-8 troubles to this commit: https://github.com/clojure/clojurescript/commit/404d6444cb6419658a7dacb343a5fed5b9451e0c so it seems the problem may be in the Closure Compiler itself
I have inspected the chrome utf-8 validation code and they use this routine to validate each codepoint read from the file: https://chromium.googlesource.com/chromium/src.git/+/9f6c8e18f1233b4ca4cb84b38a1df2e5c1462dcf/base/strings/utf_string_conversion_utils.h#29
maybe they are too strict and the Closure Compiler has a looser notion of what is valid UTF-8; I’m no expert in this encoding stuff
this is the method used for the check: https://chromium.googlesource.com/chromium/src.git/+/9f6c8e18f1233b4ca4cb84b38a1df2e5c1462dcf/base/strings/string_util.cc#519
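For anyone following along, the predicate behind that check is short; here is a rough Python approximation (not Chromium’s actual code, and the exact boundary handling is my reading of the linked header): a code point is valid unless it falls in the surrogate range U+D800..U+DFFF or above U+10FFFF.

```python
# Approximation of Chromium's IsValidCodepoint (see the header linked above):
# surrogates (U+D800..U+DFFF) and anything past U+10FFFF are rejected.
def is_valid_codepoint(cp: int) -> bool:
    return cp < 0xD800 or (0xE000 <= cp <= 0x10FFFF)

print(is_valid_codepoint(ord("A")))   # True: ordinary ASCII letter
print(is_valid_codepoint(0xD800))     # False: lone high surrogate
```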
I could probably stitch together a small cmd-line util to run that code against my failing file and see exactly which character is causing the failure
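Such a util need not wrap the C++ directly; a few lines of Python (a hypothetical stand-in that only approximates Chromium’s logic) can report the first offending character. Note that a plain strict decode, which is roughly what iconv does, only catches malformed byte sequences, whereas Chrome additionally rejects Unicode noncharacters:

```python
def first_rejected_char(path):
    """Index of the first character a Chrome-style validator would reject,
    or None. Raises UnicodeDecodeError if the bytes are not UTF-8 at all."""
    text = open(path, "rb").read().decode("utf-8")  # strict decode, like iconv
    for i, ch in enumerate(text):
        cp = ord(ch)
        # Unicode noncharacters: U+FDD0..U+FDEF and any code point
        # whose low 16 bits are FFFE or FFFF
        if 0xFDD0 <= cp <= 0xFDEF or (cp & 0xFFFE) == 0xFFFE:
            return i
    return None
```

Running something like `first_rejected_char("main.js")` against the failing build should point at the exact character.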
so the culprit is this: https://github.com/google/closure-library/blob/master/closure/goog/i18n/bidi.js#L202
the comment says: "This pattern is not theoretically correct according to the Unicode standard"; the problem is that in UTF-8 compilation mode, the Closure Compiler blindly turns those codepoints into raw UTF-8
the generated js file contains: goog.i18n.bidi.ltrChars_ = "A-Za-zÀ-ÖØ-öø-ʸ̀-ࠀ-" + "Ⰰ-︀--”; (which probably didn’t survive my copy&pasting anyways)
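For the record, this would also explain why iconv saw nothing wrong: assuming (as in the current bidi.js source) the character ranges end on values like U+FFFF, such a code point encodes to structurally valid UTF-8 yet is a Unicode noncharacter, which Chrome’s stricter check rejects. A minimal Python illustration:

```python
ch = "\uffff"                   # U+FFFF, a Unicode noncharacter
encoded = ch.encode("utf-8")    # succeeds: structurally valid UTF-8 bytes
is_noncharacter = (ord(ch) & 0xFFFE) == 0xFFFE or 0xFDD0 <= ord(ch) <= 0xFDEF
print(encoded)                  # b'\xef\xbf\xbf'
print(is_noncharacter)          # True
```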
@darwin so it sounds like the new default is a problem and we should revert that default
I don’t think so; my case is rare, I have just posted my findings here for the record: http://dev.clojure.org/jira/browse/CLJS-1547?focusedCommentId=42617
if I compiled the thing under :advanced optimizations, the bidi namespace would get DCE-ed and I wouldn’t run into the issue, I guess
no problem, it is my bad that I’m pushing cljs to those unexplored areas like chrome extensions
btw, did you have a chance to look at my proposal for inlining truth? I posted it a few days back in this channel
I think this could potentially make cljs type hints obsolete, leaving this (hard) work to the Closure Compiler
@darwin no I haven’t had time to look at that - feel free to make a ticket with your proposal so I don’t lose track of it
posted here: http://dev.clojure.org/jira/browse/CLJS-1615, and I’m open to doing more investigation and polishing on it if it proves to be a promising way forward