2020-07-31
That will recompute the classpath from scratch and overwrite the local cache file with the recomputed information.
You have to use it with the same aliases as you want to run anyway, not just on its own.
clojure -Sforce -A:usual:stuff:here
The cached information is per-unique-combination-of-aliases.
Oh, well, the CLI tools don't produce a `target` folder full of stuff. I assume that's the tree you are talking about?
Probably. Compiling `.cljs` files to `.js`?
oh hey, i tried creating src/foo.cljs and src/foo.cljc and fired up a repl and was not able to see the things in foo.cljc
I don't do any ClojureScript so I can't help with that.
What's the idiomatic way to perform a bunch of steps on a map, recording the results of each step in the map so that the results are available to the next step? I imagine something like this:
(assoc-> {} $
:foo 1
:bar (inc (:foo $))
:baz (inc (inc (:bar $))))
It's inspired by `as->`, and can be implemented in terms thereof:
(as-> {} $
(assoc $ :foo 1)
(assoc $ :bar (inc (:foo $)))
(assoc $ :baz (inc (inc (:bar $)))))
Or with `reduce`:
(reduce (fn [m [k f]] (assoc m k (f m)))
{}
[[:foo (constantly 1)]
[:bar #(inc (:foo %))]
[:baz #(inc (inc (:bar %)))]])
Either solution can be used to implement an `assoc->` macro. But then, looking at the `as->` solution, it made me wonder if it's worth it: is the cost of introducing a new macro (`assoc->`) into the vocabulary worth the small saving in cruft over the `as->` solution? My gut says "no", and yet this seems like the kind of common use case for which there should be something more idiomatic.
I'd probably prefer something with `reduce` if I wanted the previous map to be available for the next step. And I might be tempted to use point-free style if I thought it helped readability, and maybe `partition` so I could avoid the nested pairs...
(reduce (fn [m [k f]] (assoc m k (f m)))
{}
(partition 2 [:foo (constantly 1) :bar (comp inc :foo) :baz (comp inc inc :bar)]))
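For reference, a minimal sketch of the `assoc->` macro contemplated above, built on `as->` as suggested (the name and expansion strategy are illustrative, not something settled in this thread):
(defmacro assoc->
  "Sketch: thread a map through key/expression pairs, binding the map-so-far
  to sym so later expressions can read earlier results."
  [init sym & kvs]
  (assert (even? (count kvs)) "assoc-> expects key/expression pairs")
  `(as-> ~init ~sym
     ~@(for [[k expr] (partition 2 kvs)]
         `(assoc ~sym ~k ~expr))))

;; (assoc-> {} $ :foo 1 :bar (inc (:foo $)) :baz (inc (inc (:bar $))))
;; => {:foo 1, :bar 2, :baz 4}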
Not sure I'd bother with a macro for that @clojurians-slack100 since a function would do just fine:
(defn assoc-f [base & args]
(assert (even? (count args)))
(reduce (fn [m [k f]] (assoc m k (f m)))
base
(partition 2 args)))
(assoc-f {} :foo (constantly 1) :bar (comp inc :foo) :baz (comp inc inc :bar))
Something like that?
user=> (assoc-f {} :foo (constantly 1) :bar (comp inc :foo) :baz (comp inc inc :bar))
{:foo 1, :bar 2, :baz 4}
user=>
Function or macro, though, the question remains: does that complicate reading of the code more than using a slightly messier `as->` would?
I think the `as->` example you gave is a lot less readable than the one line `(assoc-f ...)` call above 🙂
(in general, I only use `as->` inside a `->` pipeline for small expressions that don't follow the first-arg threading)
(-> (something)
(f arg2)
(as-> m (assoc m :foo (inc (:bar m))))
(g arg2 arg3))
(and if I'm using `as->` at all, I prefer a less cryptic name than `$` or `%` or whatever)
As for macros, consider them a "last resort": always try to do things with a function first. Add a macro later just for syntactic sugar.
Unless you have a situation that absolutely requires that arguments are not evaluated -- then you have to write a macro. But even then, try to implement it under the hood as a function that is perhaps passed a "thunk" (a 0-arity function).
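As a small illustration of the thunk idea (all names here are hypothetical): the function does the work and only calls the 0-arity fn when needed; the macro, if you add one at all, is just sugar that wraps its body in that fn.
(defn log-when*
  "Call thunk and print its result only when enabled? is truthy;
  the thunk delays evaluation of a possibly expensive expression."
  [enabled? thunk]
  (when enabled?
    (println (thunk))))

(defmacro log-when
  "Sugar over log-when*: wraps body in a 0-arity fn (a thunk)."
  [enabled? & body]
  `(log-when* ~enabled? (fn [] ~@body)))

;; (log-when false (expensive-report))  ; hypothetical expensive-report is never evaluated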
Great answer(s), thanks @seancorfield
10-20x slower than... what @kbosompem?
Do you have development middleware doing reloading etc?
Hard to give you much guidance without more details about exactly what you have in play and what you're comparing it to... 🙂
Normally, running code from source will be slower to start up (than an uberjar) because it all has to be compiled as it is loaded (whereas the uberjar has everything pre-compiled) but then it should be the exact same speed in both cases once it is running.
Even with that compilation overhead at startup, I would be surprised by a three minute startup time unless you had a pretty big, complex app...
Sounds like it has middleware that automatically watches for code changes and reloads it for you?
I never used the `ring` plugin with `lein`. I just have code to start the Jetty server directly in `-main`, and I can also easily start it from the REPL. So I've no idea what the plugin might be doing that's taking so much time.
getting all of your deps aot compiled and on the classpath would probably make a huge difference in startup time
I pretty much never need to reload code. I evaluate code directly into the running app so I can avoid that. And although I don't use HugSQL, I would have expected not to need a reload just because I changed the SQL files -- you should be able to ask HugSQL to redefine the functions and have those new definitions in play immediately in the running REPL...
I generally advise beginners to avoid plugins and any special middleware that tries to auto-reload code: there's a lot of "magic" going on in those sorts of tools that can mask (or even cause) all sorts of problems and make things hard to debug.
you might also check your Java version - it's also possible you're hitting the Java regression we saw around Java 1.8u202 / 11.0.1 which primarily manifests as very slow startup when using user.clj. Newer versions of those did have some improvements but really using Clojure 1.10.1 has changes to sidestep the regression.
Ah, I didn't even think about `user.clj` -- something else I avoid 🙂
often in play with this kind of ring setup
I mean, the user.clj is probably the thing loading all the code in this case
I'm curious as to how much the Cognitect folks lean on `user.clj` when they're developing? I think I've only ever used it once, in just one project, and that was to `compile` a specific ns when a REPL started up so that a `(:gen-class)` type would be available to the rest of my code.
probably varies, but Rich, Stu, and I never use it
That's what I figured. We need y'all to do more talks about your dev workflow and your use of the REPL. It seems to me that a lot of beginners go down the rabbit hole of all sorts of weird plugins and add-ons and complex, error-prone "reloaded" / auto-refresh tools 😞
well Stu did a pretty good one :)
there's not much to talk about if you keep it simple :)
"here's me not using lein or any plugins and just starting a repl"
but also, our use cases are different than most Clojure devs doing web stuff.
I dunno. I do web stuff (backend) all day, every day and my workflow is like y'all's 🙂
Stu's talks are what I refer people to, and the podcast he was on, talking about his workflow.
And Eric's "RDD" course is awesome for that too.
But, yeah, no plugins, no refresh, no watchers, no auto-reloading, no `user.clj`.
And I gather not much use of CIDER/nREPL at Cognitect either, I'll warrant?
Does it have a `(ns ...)` form in it, or is it just a top-level script with some development-related code in it?
OK, then not the `user.clj` file that Alex was referring to.
wow thank you thank you @seancorfield and @alexmiller did the "compile" and run jetty from the repl and it was instantaneous
Same call went from 6s on average in dev mode to 1.2s which is close enough to the 800ms with a jar
My preference is to keep my dev setup as close to my production setup as possible: no plugins, no dev-only middleware, no watchers, no auto-reloading -- just a tight REPL workflow.
(although I do build an uberjar with AOT compiled code for production deployment -- but I still run the same Socket REPL in both production and local dev)
while i have both of you here, one more question. I use timbre for logging and i like it, but it adds at least half a second to each function call, so i wrapped the call in a future to "speed" things up. unfortunately now the log isn't sequential. what would you recommend?
a half second sounds unusable to me
some Clojure loggers use an agent - should be basically sequential (per concurrency constraints) due to the agent queue. not sure what timbre has
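A minimal sketch of that agent approach (the write itself is a stand-in): sends queue up per agent, so log writes happen one at a time in submission order without blocking the calling thread.
(def log-agent (agent nil))

(defn async-log!
  "Queue a log write on the agent: the caller returns immediately,
  but the writes run sequentially in the order they were sent."
  [data]
  (send-off log-agent
            (fn [_]
              ;; stand-in for the real appender work, e.g. a DB insert
              (println (:level data) (force (:msg_ data))))))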
No idea why Timbre would take so long. What do you mean by "half a second to each function call"? You mean just adding logging calls?
Oh... well, possibly. We used Timbre at work extensively but never used its database appenders. Perhaps it is setting up a new connection on every call, instead of using a connection pool?
If you can use a connection pool with that database appender, I'd expect it to be much faster.
Or HikariCP.
What's the DB appender library you're using?
(defn mysql-appender
  "Timbre appender config map that logs to the application database via log-message."
[db-config]
{:enabled? true
:async? false
:min-level :info
:rate-limit nil
:output-fn :inherit
:fn (fn [data] (log-message db-config data))})
(defn log-message
"Log application messages in application database."
[config data]
(let [entry
{:instant (java.sql.Timestamp. (.getTime ^java.util.Date (:instant data)))
:level (str (:level data))
:line (str (:?line data) " | " (:tid data))
:vargs (str (:vargs data))
:namespace (str (:?ns-str data))
:content (str (force (:msg_ data)))
:error (str (:?err data))
:hostname (str (force (:hostname_ data)))}]
(jdbc/insert! config :logs entry)))
Triple backtick at the start and end of code makes it a lot more readable.
👍🏻
(def log-config
{:level :info
:appenders {:mysql-appender (mysql-appender db)
:rolling-appender (rolling-appender {:path log-file-name :pattern :daily})}})
So, yeah, your `mysql-appender` should make a connection pooled datasource from `db-config` and use `{:connection cpds}` instead of `db-config` in the `log-message` call.
(for what it's worth, `next.jdbc` is the replacement for `clojure.java.jdbc` and has built-in support for connection pooled datasources with both c3p0 and HikariCP)
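A rough sketch of that suggestion using `next.jdbc.connection/->pool` with HikariCP (assumes `com.zaxxer/HikariCP` is on the classpath; the exact `db-config` keys are an assumption, and HikariCP expects `:username` rather than `:user`):
(require '[next.jdbc.connection :as connection])
(import 'com.zaxxer.hikari.HikariDataSource)

(defn mysql-appender
  "Build the pooled datasource once, when the appender is created,
  and hand log-message a db-spec that reuses it on every call."
  [db-config]
  (let [ds (connection/->pool HikariDataSource db-config)]
    {:enabled?  true
     :async?    false
     :min-level :info
     :fn        (fn [data] (log-message {:datasource ds} data))}))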
And HugSQL supports `next.jdbc` -- there's a getting started with HugSQL guide in the `next.jdbc` docs.
Overall, `next.jdbc` is faster than `clojure.java.jdbc` in general usage...
If you have any questions about `next.jdbc` (or about `clojure.java.jdbc` -- I maintain both), drop into the #sql channel and ask (I'm more likely to see questions there).
What is the best way to invert a filter? Such as the following:
clojure-rte.core> (filter :accepting [{:accepting true
:a 100}
{:accepting false
:a 200}
{:accepting true
:a 300}])
({:accepting true, :a 100} {:accepting true, :a 300})
clojure-rte.core>
what I really want to do is partition the sequence by a predicate, binding the true responses to one variable and the false responses to the other.
(let [[trues falses] (partition-by-predicate some-sequence some-predicate)]
...)
I'm tempted to call `filter` twice, once with the predicate and second with the `complement` of the predicate
the complement of `filter` is `remove`, but in this case you probably want `group-by`:
(map (group-by :accepting coll) [true false])
(map (group-by ...) [true false])
that's clever
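Spelled out with map destructuring, that gives exactly the two bindings the original question asked for (note the keys are whatever `:accepting` returns, so a nil value would land under nil rather than false):
(let [{trues true, falses false}
      (group-by :accepting [{:accepting true  :a 100}
                            {:accepting false :a 200}
                            {:accepting true  :a 300}])]
  [trues falses])
;; => [[{:accepting true, :a 100} {:accepting true, :a 300}]
;;     [{:accepting false, :a 200}]]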
another question. I'm implementing a function which mathematically returns a set of sets, so I don't care which order the elements are generated in. i.e., `[[1 2 3] [4 5]]` is semantically equivalent (for my program) to `[[5 4] [1 3 2]]`. I've implemented this forcing the results to be sets, which makes the test cases easy to write. However, when I map and filter using sets, I don't get back sets.
If I want to write my functions using just default clojure sequences, what's the correct way to write my tests?
(defn %partition-by
  "Given a set of objects, return a set of subsets thereof which is a partition of
  the given set. Every element in any one subset has the same value under f, and
  the value under f is different for any two distinct subsets. f is not called
  if the size of the given set is 1."
  [objects f]
  (if (= 1 (count objects))
    #{objects}
    (set (map set (vals (group-by f objects))))))
(deftest t-%partition-by
(testing "%partition-by"
(is (= (%partition-by #{1 2 3 4 5 6 7} even?)
#{#{1 3 5 7}
#{2 4 6}}))
(is (= (%partition-by #{1 2 3 4 5 6 7 8 9} #(mod % 3))
#{#{2 5 8}
#{1 4 7}
#{3 6 9}}))))
BTW, I also discovered that `(filter (complement :accepting) ...)` also works
it works, but `(remove :accepting ...)` does the same thing, and that's why remove exists
I'd like to iterate over a sequence (for example using `(map f sequence)`) and have the iteration function `f` return a key/value pair, to generate a map rather than a sequence.
You can just return a 2-element vector: `(into {} [[:a 1] [:b 2] [:c 3]])` -> `{:a 1, :b 2, :c 3}`
Or to better fit your example:
(into {}
(map (fn [x]
[x (inc x)])
[1 2 3]))
=> {1 2, 2 3, 3 4}
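A closely related variant, for what it's worth: `into` also takes a transducer, which avoids building the intermediate sequence of pairs:
(into {} (map (juxt identity inc)) [1 2 3])
;; => {1 2, 2 3, 3 4}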
do we have an array tabulate function? I.e., I want to create an array of size N where, rather than explicitly giving N values for the array, I instead give a function which maps an integer to the intended array value? Of course I can write such a function, but want to re-use what's there. I guess it would be something like this?
(defn tabulate
  "Return a vector [(f 0) (f 1) ... (f (dec n))]."
  [n f]
  (into [] (map f (range n))))
`map-indexed` might be helpful?
@alexmiller I didn't know about `map-indexed`; that's sort of a cousin of `zip-with-index`, I guess. BTW, my guess is that `(into [] (map f (range n)))` is a pretty good solution, right?
i often do `(map vector (range) coll)` to end up with tuples of index and items from the collection
because i forget the order of args for `f` in `map-indexed`. so i just make my own tuples
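For comparison, `mapv` gives the tabulate behaviour directly, and the two index-pairing idioms above produce the same tuples:
(mapv #(* % %) (range 5))         ; => [0 1 4 9 16]   a vector "tabulate"
(map-indexed vector [:a :b :c])   ; => ([0 :a] [1 :b] [2 :c])   f receives the index first
(map vector (range) [:a :b :c])   ; => ([0 :a] [1 :b] [2 :c])   same tuples, no arg order to remember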
I thought an expression like `(some #{target} sequence)` works as a Boolean to determine whether target is an element of the sequence. It doesn't always work. what should I be using instead?
if target is nil then `(some #{nil} ['a 'b nil 'c])` returns logical false (nil)
generally, I would either use a set or a map instead and avoid the linear search
@alexmiller, that's a response to which comment?
"Returns the first logical true value of (pred x) for any x in coll." `#{nil}` as a predicate can never return logical true.
Also `(some #{false} ['a 'b false 'c])` returns nil, which surprises me.
just as a general thing, don't use a set as a predicate if you are trying to match logically false values (`nil` or `false`)
this is just one example of that more general statement
@dpsutton. yes I see. good explanation, thanks. So what is the correct way to search for an item in a sequence?
don't search a sequence for an item - make a keyed data structure
or use some (but beware of the caveats above)
or combo of first / filter is done sometimes
I'm using an array, because I need O(1) access.
it is indeed linear when I'm scanning it during construction. However, after I've constructed, it is O(1)
but `(some #(contains? #{nil false} %) [1 2 nil 3])` can do the search for logical false values
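To make the caveat concrete: an explicit equality predicate (or the contains? form above) handles nil/false targets where a bare set does not.
(some #{nil} ['a nil 'b])          ; => nil   set-as-predicate misses nil
(some #(= nil %) ['a nil 'b])      ; => true
(some #(= false %) ['a false 'b])  ; => true
(contains? (set ['a nil 'b]) nil)  ; => true  if you only need membership on a known collection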
my current challenge is the construction phase. I don't mind that it is O(n)
So given a variable `x` which has been passed to me from the user of my library, how can I know if that value is in my array?
I think we're backing into the real problem "what data structure should I use given access pattern X"
I can of course iterate over the array and test every value == x
given only that description I would say, use a map and look it up
I'm a bit shocked that the sequence api does not have membership as an operation.
it's quite intentional
because you should use a data structure that supports that question better
My philosophy is that you should choose the data structure based on what makes the most sense for the entire program. not just to make it easy to construct.
But anyway, I understand the limitation. I don't have to like it, I just have to understand it.
that seems right, but seems like you're choosing a data structure that is slow to answer the operations you are doing on it (not construction)?
for the entire program would include O(1) membership checking and you're missing that feature in your data structure
I don't need membership checking after the object has been constructed.
after it has been constructed all operations are done by array index.
The construction of my object is already exponential worst-case. So I don't mind an additional linear search.
as far as I know, all you can define is extra `:clean-targets` to clean up; for arbitrary tasks I'd define a new alias that calls clean along with your other tasks:
:aliases {"my_clean" ["do" "clean" "foo" ["bar" "bar-arg"]]}
(do (for [i [1 2 3]] (println i))
nil)
@p.kushwaha97 ^ same effect here, no macro
I see. Can I force/demand it (for debugging)? Usually to get rid of compiler optimizations you'd do something with side effects (like printing it out) and that'd be sufficient.
it's not an optimization - for isn't a loop, it's a lazy generator
there's also `dorun` for forcing side effects while discarding the value (but at that point why were you using a lazy thing in the first place?)
everything in Clojure is eagerly evaluated, but you have facilities to make lazy/infinite lists
I think this solves my current problem. I'll read up more on how clojure handles lazy evaluation and `dorun`. Thanks everyone!
(doseq [i [1 2 3]] (println i)) ; just for side-effects
(run! println [1 2 3]) ; map-like function for running side-effects
Actually, `doseq` fits better than `for`. I now understand what @noisesmith was saying about why I'm using lazy things in the first place. I'm coming from common lisp / racket so I assumed that "for" was similar to the loop macros there. Should've read docs more carefully.
Hey again. Looking into routing in a new ring app... I looked at reitit and compojure, but in the end, I kinda prefer just plain old core.match. Passing in the request method and uri (split on "/"). Is there any good reason to NOT do that? Are there security concerns that the routing libraries handle?
routing libs can give conveniences like route parameters and reverse routing (eg. "construct a route to handler x" as a markup helper), but I don't know of any big security gotchas with home grown routing
btw compojure is far from the best routing lib out there IMHO - people like it for the syntactic sugar, but there are quite a few other options
Ah maybe I just had a bad first experience. What ones do you like?
bidi and reitit are good
bidi's readme has a feature comparison matrix https://github.com/juxt/bidi
I would not expect `core.match` to perform particularly well compared to dedicated routing libs -- and I'd also note that it hasn't had a functional update in about three years (and is considered "stable", which means there's no active development being done -- https://clojure.org/community/contrib_libs ). Not that "stable" is bad in any way.
We mostly use Compojure at work, with one app using Bidi. If we were starting over, we'd probably use Bidi or Reitit for all of our apps I think.
Ah ok
Reading Bidi readme now. I didn't know about that one
I think their feature matrix gives a fair overview for comparison too - data oriented and reverse routing and clj/cljs compat are three huge ones IMHO
Reitit would probably be my first choice at this point. The Metosin folks are obsessed with performance and it works well with `clojure.spec`, which we use very heavily.
I'm struggling to understand the use case of reverse routing in a practical sense
(I say that from zero experience with it -- but I'd evaluate it as a first choice)
@coding.aaronp if you want to link from one page to another, reverse routing means you can link based on feature, and not a specific site map
which means that if marketing decides the routes need to be prettier or whatever, the link generation handles the change
Yeah, it's very useful if you are refactoring your routing during development.
if I was going to roll routing from scratch I would think about starting from a parser
Oh, so you might actually use it outside of the router itself? Like (static-path "foo.js") to create a route to wherever
right - that's part of why the router working from both clj and cljs is good
Well that's very interesting. Thanks!
but that also applies with back-end pre-rendering with links
For sure
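A small sketch of what reverse routing looks like with reitit (route names and paths are made up for illustration):
(require '[reitit.core :as r])

(def router
  (r/router
   [["/users/:id" :route/user]
    ["/ping"      :route/ping]]))

;; forward routing: path -> match (name, path params, route data)
(r/match-by-path router "/users/7")

;; reverse routing: name + params -> path, callable from clj or cljs
(:path (r/match-by-name router :route/user {:id 7}))
;; => "/users/7"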
@coding.aaronp To add another data point - I am using `bidi` currently at work, but if I were to start a new project today, I'd definitely go for `reitit`
Thanks @robertfrederickwarner Why is that?
Both are data driven, which is great. When we started the project we were using Compojure, but did not like using macros. `reitit` takes inspiration from `bidi` in being data driven, and I think improves on it in terms of provided functionality and developer ergonomics. `metosin` do a lot of work in this area and have several excellent tools in their open source ecosystem, and `reitit` provides modules for integrating with many of them.
What's the best way to quickly iterate on Clojure macros that are used in ClojureScript? I'm having to restart my shadow-cljs process every time I make a change for it to take effect. Is there a better way? I have this in a clojure file.
(defmacro parse-file [file]
(keywordize-keys
(toml/parse-string (slurp (io/resource file)))))
And then I'm doing a `(:require-macros)` in my ClojureScript file and then `(def data (parse-file "example.toml"))`.
Everything is working fine except that when I edit the .toml file the changes don't appear until I restart my shadow-cljs watcher. (I know this could be fixed by making an xhr request rather than inlining the data with a macro.)
hrm, just spent about half an hour banging my head against the wall -- apparently protocol methods can't accept varargs. I think it should have been mentioned somewhere here https://clojure.org/reference/protocols
so, if I want to implement my own reftype (via `deftype`), and I want it to work with `swap!` -- my only option is to implement `clojure.lang.IAtom`, including each and every arity of IAtom's `swap`?
they allow varargs, but your implementation has to cover each arity from the original, or fail at runtime
in practice that just means one vararg impl, and a bunch of other impls that call it, most of the time
also IAtom is an interface, not a protocol, and it only has 4 arities
this isn't about protocols, and for the most part it's not even defined by clojure, the VM dispatches methods by signature, and each arity that is defined in an interface is a separate signature for dispatch
@noisesmith what do you mean "they allow varargs"? `(defprotocol RefVal (swap! [this f & args]))` doesn't seem to do anything useful
AFAIU it now, it says "has method "swap!" that takes "this" and 3 args ("f", "&" and "args"), so it's not varargs
OK - I misremembered that
but that's not even relevant to IAtom which is an Interface, not a protocol
clojure.core/swap! turns arities into a call with a seq
clojure.core/swap! is varaargs
then do what clojure.core did, and make a function calling your protocol
but why do you even need a protocol if you aren't using IAtom?
Ok, let's rewind. I wanted to have my own ref type that looks like an atom (but is not really an atom)
I think having to make your own fake swap! via your own protocol is a much hackier alternative compared to just supporting a few arities of IAtom / swap
Anyway, so I take it as a "yes" ("I have to re-implement all arities of IAtom's `swap`"), right?
the point of protocols / interfaces is to be able to trade out implementations, if what you are doing is a parallel incompatible implementation, what is the benefit of a protocol?
you need to support all the arities that could meaningfully be called, none will be auto-generated
but if they never get called, then clojure and the vm dont' care that they aren't implemented
the benefit of protocol, in this particular case, would be that I could have the same user-visible API without being tied to IAtom (which my new ref type isn't)
you don't need a protocol for that
function plus deftype suffices
but I still think the easier thing is to just implement the 4 or 5 arities
swap! won't work anyway, if you don't use the proper interface, please don't go and monkey patch core
I mean, I can't stop you, but it's a bad idea
you can make a fake swap function as easily (easier) than you can make a fake atom protocol
I can make a fake `swap!`, but then I'll lose the real `swap!` in that namespace, I don't like that
user=> (swap! (reify clojure.lang.IAtom (swap [this f] "OK")) inc)
"OK"
yeah, all variants involving IAtom are pretty obvious (but generally involve the "all arities" thing)
the more arg in the last arity fakes varargs
user=> (swap! (reify clojure.lang.IAtom (swap [this f a b more] more)) :a :b :c :d)
(:d)
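Putting the advice above together, a sketch of a deftype covering every IAtom `swap` arity (the locking-plus-mutable-field storage is a simplification for illustration, not how clojure.lang.Atom works):
(deftype MyRef [^:unsynchronized-mutable state]
  clojure.lang.IAtom
  ;; each arity applies f itself; only the widest one needs apply
  (swap [this f]          (locking this (set! state (f state))))
  (swap [this f x]        (locking this (set! state (f state x))))
  (swap [this f x y]      (locking this (set! state (f state x y))))
  (swap [this f x y more] (locking this (set! state (apply f state x y more))))
  (compareAndSet [this oldv newv]
    (locking this
      (if (= oldv state)
        (do (set! state newv) true)
        false)))
  (reset [this newv] (locking this (set! state newv))))

;; clojure.core/swap! and reset! are type-hinted to the interface, so they just work:
;; (def r (MyRef. 0))
;; (swap! r + 1 2 3)  ; => 6
;; (reset! r 42)      ; => 42
;; (implement clojure.lang.IDeref as well if you want @r to work)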
I guess clojure has to hack around how weird varargs are on methods in interop, and that's why you can't use `&` in protocol signatures
@seancorfield: Re this function you wrote yesterday:
(defn assoc-f [base & args]
(assert (even? (count args)))
(reduce (fn [m [k f]] (assoc m k (f m)))
base
(partition 2 args)))
Why use an `assert` instead of a precondition, like this?
(defn assoc-f [base & args]
{:pre [(even? (count args))]}
(reduce (fn [m [k f]] (assoc m k (f m)))
base
(partition 2 args)))
To be honest, mostly personal preference -- I don't like the `:pre`/`:post` syntax much @clojurians-slack100
I do use `:pre` and `:post` occasionally. In our 100k line code base at work, there are 11 `:pre` conditions and just one `:post` condition. There are a lot more `assert` calls.
(333 `assert` calls in 68 files)
Related curiosity: is there a way in Clojure to turn `assert`s into no-ops, like Python's `-O3` cmd-line arg?
Yes, if you compile your code with `*assert*` `set!` to falsey, then `assert` becomes a no-op.
But I think that's a bad idea.
If the assert would have triggered in production but you turned it off, instead of getting a hard exception (well, an `Error`, not an `Exception`), that bad data continues to flow through your program, potentially corrupting your data and/or producing unintended behavior.
Leiningen provides a way to do it (`:global-vars` in `project.clj`). You can turn it off for a session via the Clojure CLI:
(! 1043)-> clj
Clojure 1.10.1
user=> *assert*
true
user=> ^D
(! 1044)-> clj -e '(set! *assert* false)' -r
false
user=> *assert*
false
user=>
If you're using `depstar`, you can use that `-e` trick to disable assertions when compiling code for an uberjar (but I don't recommend it 🙂 )
(I actually had to test that to be certain it was possible):
(! 557)-> clojure -m hf.depstar.uberjar test.jar -C -m hf.explore
Compiling hf.explore ...
Building uber jar: test.jar
Processing pom.xml for {seancorfield/depstar {:mvn/version "1.0.96"}}
(! 558)-> java -jar test.jar
Exception in thread "main" java.lang.AssertionError: Assert failed: (pos? x)
at hf.explore$foo.invoke(explore.clj:5)
at hf.explore$_main.doInvoke(explore.clj:10)
at clojure.lang.RestFn.invoke(RestFn.java:397)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at hf.explore.main(Unknown Source)
(! 559)-> clojure -e '(set! *assert* false)' -m hf.depstar.uberjar test.jar -C -m hf.explore
Compiling hf.explore ...
Building uber jar: test.jar
Processing pom.xml for {seancorfield/depstar {:mvn/version "1.0.96"}}
(! 560)-> java -jar test.jar
# runs with no assertion error
(! 561)->
I was just looking at depstar last weekend as a substitute for `lein uberjar`, in a small side project I'm trying to convert to `tools.cli` 😬
Feel free to DM me with questions about either `depstar` or `tools.cli` since I maintain both. For Clojure CLI / `deps.edn` questions in general, the #tools-deps channel is a great place to ask.
I'll be online a lot of the weekend (I've been off yesterday and today as well -- nice four day weekend!). We're not early risers but I'm often a night owl -- Pacific time.
(and I always have my laptop open while watching TV -- we're binge-watching Star Trek: Deep Space 9 and we're into the final season at this point)
We already did all of The Original Series, then TNG, then into DS9 🙂 All since quarantine started here in mid-March!
(and we did all seven seasons of Star Wars: The Clone Wars animated series before we switched to Star Trek)