
@tengstrand there's a section in the readme about an integration hiccup with Cursive that I think I know how to fix


An alternative way of adding the component is to specify it as an :extra-deps entry (the development/src directory still has to be specified as a path):

 :aliases  {:dev  {:extra-paths ["development/src"]
                   :extra-deps  {poly/user {:local/root "components/user"}}}
            :test {:extra-paths ["components/user/test"]}}
If you use Cursive as an IDE, this will not work correctly. The problem is that Cursive doesn't treat components/user/src as a source directory in the IDE (it will not be marked green). This is also why we use the first form in this example.
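For reference, the "first form" mentioned here is the one that adds the component's directories directly as :extra-paths. A sketch, based on the paths used in this example (the resources path is an assumption):

```clojure
;; ./deps.edn (sketch) -- the :extra-paths form that Cursive handles correctly
{:aliases {:dev  {:extra-paths ["development/src"
                                "components/user/src"
                                "components/user/resources"]} ;; resources path assumed
           :test {:extra-paths ["components/user/test"]}}}
```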


Using Import Module from Existing Sources and re-importing the example folder, you can tell Cursive to automatically import transitive :local/root dependencies.


This will mark user as a module and colour its src and test folders properly.


That sounds great!


I assume you haven't tried this yet?


No, not yet.


Okay, I'll proceed like this then, to see if I run into any trouble later on

👍 2

But I've used this method in a different-style monorepo I regularly work on, and it works fine


I will try this later today. If you want, you can create an issue and describe how you do it in more detail.


Feel free to use the images if you need some for the readme (although I use a light colour scheme)

👍 2

I checked it now in Cursive, but I haven’t had luck with test files. Since via :local/root one can only depend on the sources, even though the test namespaces are highlighted correctly, they are not on the classpath and hence cannot be run from the REPL. They also do not resolve internal dependencies.


I guess one workaround could be to include the sources via :local/root but the tests the old way, but I would prefer not to mix the two; I'd rather stick with one approach.


I commented back on the ticket. Test files need to be extra-pathed regardless of the editor, and that's what the readme currently suggests in both cases.


I see. Then maybe the documentation shouldn’t have suggested it. I’m not a fan of mixing things up. What was the idea there @tengstrand?


Full support from Cursive would of course be the best, so that we only have to add a single :local/root the same way as for the other projects. My idea was that it’s better to add a single line for the test path and a single line for the src as :local/root, where the latter could include the resources and its libraries for you via each brick, so that we don’t have to add them separately in ./deps.edn. So if you add a library dependency or a resource path to a component, it will automatically be detected. That was the idea (if it works, of course!).
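The idea described here, that a single :local/root pulls in each brick's paths, resources, and libraries, relies on each brick having its own deps.edn, roughly like this (a sketch; the library dependency is a hypothetical illustration):

```clojure
;; components/user/deps.edn (sketch)
{:paths ["src" "resources"]                             ;; picked up automatically via :local/root
 :deps  {org.clojure/data.json {:mvn/version "2.4.0"}}  ;; hypothetical brick-local library
 :aliases {:test {:extra-paths ["test"]}}}              ;; test paths still need wiring in ./deps.edn
```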


What I'm trying to point out is that Cursive has the same level of support for this as all the other editors


it's tools.deps that needs the test path in addition to the local root


I think you are right. The reason we can leave out the test path in the other projects is that the built-in test runner (via the test command) relies on :local/root even for the test paths, which tools.deps doesn’t.


Oh I see


I get the point regarding the usage you explained @tengstrand. I cannot remember the exact reason we said Cursive was not supporting it. I have a weak memory that it could be something related to adding/removing namespaces and files in the src and resources directories of the components. I guess those were not getting included automatically in the classpath. I need to check that, though, before saying this is okay to use with Cursive.

👍 2

Please let me know if you do find an issue there


My understanding was that by default Cursive didn't support it (and perhaps did not have that auto-import option a while back?). Since we don't use Cursive at work, our :dev alias uses :extra-deps and :local/root for bricks and our :test alias uses :extra-paths and I'm fine with that since it's common for :test aliases to have :extra-paths in every project.


I have tried it again, using :local/root for the :dev alias and :extra-paths for the :test alias. I’ve added each component and base as modules. I can start a REPL from within Cursive, and it seems like it has the correct classpath. However, Cursive does not resolve references to namespaces from other components. I do not know if I did something wrong, but this basically means it does not work this way. @U08BJGV6E, when you tried, were you able to reference namespaces from other components and have Cursive resolve them (meaning you can CMD+click them to navigate and there is no warning)?


@U2BDZ9JG3 I tried this with 2 dummy components and I did get this error, until... I added the other component as a dependency of my consuming component. Top-level deps.edn:

{:aliases {:dev  {:extra-paths ["development/src"]
                  :extra-deps  {org.clojure/clojure {:mvn/version "1.10.3"}
                                org.clojure/tools.deps.alpha {:mvn/version "0.12.1003"}
                                poly/user {:local/root "components/user"}
                                poly/foo  {:local/root "components/foo"}}}
           :test {:extra-paths ["components/user/test"]}
           :poly {:main-opts ["-m" "polylith.clj.core.poly-cli.core"]
                  :extra-deps {polyfy/polylith
                               {:git/url   ""
                                :sha       "adb8617af2df33361e9eb5715693b5c8a08e0b43"
                                :deps/root "projects/poly"}}}}}
user component:
{:paths ["src" "resources"]
 :deps {poly/foo {:local/root "../../components/foo"}}
 :aliases {:test {:extra-paths ["test"]
                  :extra-deps {}}}}
foo component: just the default out of create component name:foo


(ns se.example.user.core
  (:require [se.example.foo.interface :as foo]))
Navigation and name resolution both work.


But components should not specify their interdependencies, right? That breaks the swappability aspect of Polylith.


So, based on that, Cursive is still broken since it should be able to find/import foo purely based on the :dev alias in the workspace deps.edn file.


Oh, they shouldn't? I wasn't aware of that


You are perfectly correct @U04V70XH6! Components should not depend on each other; that’s one of the most important rules of Polylith. Based on this, I think we’ll need to wait until Cursive brings support for Polylith-type projects.


We can be clearer about that in the documentation: components shouldn’t depend on other components. Which concrete components to use is decided by each project, and components should only know about interfaces.


Maybe more generally: a brick shouldn’t depend on another brick in its deps.edn file. That way we include the bases as well.

👍 2

Okay, so it's Projects that bring all the dependencies together, then, I'm assuming?


Yes, that’s how it works.
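In deps.edn terms, it is each project's own deps.edn that selects the concrete bricks. A sketch with hypothetical project and brick names:

```clojure
;; projects/my-service/deps.edn (sketch) -- the project wires bricks together
{:deps {poly/user {:local/root "../../components/user"}
        poly/foo  {:local/root "../../components/foo"}
        poly/api  {:local/root "../../bases/api"}}}
```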


I see there's already an issue open with Cursive. I'll try to add some more detail to it; perhaps it helps Colin add support for it


yep, that's where I'll try to add more detail

👍 2

Yes, sorry, you pointed to that already!


Apologies for all the noise yesterday


The monorepo I work on has inter-module dependencies, so that's why I'm seeing correct behaviour there (it isn't a poly project)


One more question, a bit unrelated: in poly you only ever start a REPL for the development project, is that correct?


Yes, that’s how you normally work with the code. And then you can use profiles to mimic other projects/deployables if e.g. an interface has more than one “implementation” (component), so that e.g. the interface c is implemented by c1 used in project p1 and c2 used in project p2. Polylith encourages you to work from the development project and set up profiles to mimic the different projects (one profile will include c1 and another will include c2 in this case).
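Such profiles are aliases in the workspace deps.edn, conventionally prefixed with +, that each pull in one implementation of the interface. A sketch reusing the c1/c2 names (the profile names are illustrative):

```clojure
;; ./deps.edn (sketch) -- one profile per "implementation" of interface c
{:aliases {:+default {:extra-deps {poly/c1 {:local/root "components/c1"}}}
           :+remote  {:extra-deps {poly/c2 {:local/root "components/c2"}}}}}
```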


You can have two different REPLs specified in this case, if it’s important to mimic the production environment as it is, but normally it’s enough to stick to the “dev” profile.


How does polylith work with tools.namespace's reload functions? Oh, and if you have c implemented by c1 and c2, can you then still have a single development profile referencing both of those? I would expect namespace clashes.


Exactly, you need to choose one of them in each project.
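The clash comes from both components implementing the same interface namespace, so they cannot both be on the classpath at once. A sketch with the c/c1/c2 names from above (the se.example top namespace is borrowed from the earlier example):

```clojure
;; components/c1/src/se/example/c/interface.clj (sketch)
(ns se.example.c.interface)
(defn fetch [id] {:id id :source :c1})

;; components/c2/src/se/example/c/interface.clj (sketch)
(ns se.example.c.interface) ;; same ns as c1 -> only one can be loaded
(defn fetch [id] {:id id :source :c2})
```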


You would have two profiles and two REPLs (and why does anyone use tools.namespace?)


That whole "reload" thing is just a workaround for a poor REPL workflow, in my opinion. My REPLs run for weeks without any sort of reload/refresh.


@tengstrand thanks for confirming that

👍 2

@U04V70XH6 I'm not saying it's a superior workflow, but it's quite common in my experience. I find reload-all useful for when I want a clean slate and to ensure that my REPL's program state is as close as possible to when it will run starting from source in CI (or deployed somewhere), but I don't want to restart my REPL


Nothing prevents using tools.namespace’s reload functions with Polylith. I’ve used them and didn’t have any issues. But I agree with @U04V70XH6 about it.
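For completeness, the tools.namespace workflow discussed here is driven from clojure.tools.namespace.repl, roughly like this:

```clojure
;; requires org.clojure/tools.namespace on the classpath (e.g. via the :dev alias)
(require '[clojure.tools.namespace.repl :refer [refresh refresh-all]])

(refresh)      ;; reload only the namespaces whose files changed
(refresh-all)  ;; reload everything, for a clean slate
```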

Matt Ielusic 23:09:09

I have a bit of a shower thought: I see the word "brick" used here a lot, but the Polylith website doesn't define the term like it defines "Component" or "Workspace." Is "brick" just a catchall term derived from "lego brick", used to refer to Components, Bases, maybe Projects in some contexts, and quite possibly a whole Workspace in others?


It's introduced in the poly tool's README and at one point it explicitly says "Components and bases are referred to as bricks (we will soon explain what a base is)."


(but, yes, it probably should mention that in the overview gitbook)


From the introduction of the poly tool readme:
> Polylith introduces the architectural concept of “service level building blocks”, which can be combined like LEGO bricks to build our services and systems. Polylith’s LEGO-like bricks are easy to reason about, test, refactor, and reuse. They allow us to work with all our code in one place for maximum productivity, using a single REPL.
> The bricks can easily be put together to form different kinds of deployable artifacts, like services, tools and libraries, in the same way we put together LEGO when we were kids! Not surprisingly, it's just as simple and fun!
> To give you an idea of what that can look like, take a quick look at the bricks and libraries that we use to build the Polylith tool (which is itself a Polylith workspace, represented by the poly column in the first diagram):

Matt Ielusic 00:09:55

Ah, I see, thanks.


The gitbook is very high level and really doesn't talk about a lot of the stuff at all. The tool's readme is huge -- it would be much easier to digest as a gitbook or up on cljdoc! 🙂

Matt Ielusic 00:09:31

Yeah, I spent an afternoon last month just going through the whole readme

Matt Ielusic 00:09:41

It's a great tutorial


Yes, maybe it’s time to move to gitbook or something similar. I’ve had that idea for a long time now. I also have the idea to structure the documentation slightly differently when the new shell command comes out, so that the new way of working gets the focus. Good idea to be clearer in the overall documentation about what a brick is (component/base). It also feels like we could have a small example in the overall doc and talk a little around it, for people who come new to Polylith.


@tengstrand Since you make releases of the poly tool to Clojars, you could just leverage cljdoc and break the readme up into multiple markdown files in doc -- but it looks like @U2BDZ9JG3 makes those releases and I'm a bit surprised by the group/artifact ID used...


If you did go to cljdoc, I don't know how it would deal with the monorepo, to be honest. So maybe gitbook would be easier since you're already using that.


Furkan always made the releases before (he set up the CircleCI script), but I have started to do it now, and I released the latest one (it still shows that @U2BDZ9JG3 did it, though). I’ve started to migrate the documentation to gitbook, and it will definitely be easier to use the documentation when that is released! @U04V70XH6


Yeah, CI uses my Clojars account’s key; that’s why it shows me on Clojars 😂. And yes, the Clojars group ID (polylith) does not match the GitHub organization (polyfy), and it’s not a fully qualified reverse-domain name. The reason is that this group is used for the old Leiningen version, and at that time I wasn’t aware of the best practices 😂. I think at some point @tengstrand contacted them to change it, but I guess they are immutable. If you have any suggestions to fix it, feel free @U04V70XH6.


@tengstrand If you are thinking about revamping the documentation structure, perhaps you may want to look at this documentation pattern that's been used by several high-profile projects (the video is well worth watching):


Okay, cool! Thanks for the link.


Now I have watched it. It was really good! I don’t think I will restructure the documentation right now, but I will definitely keep these four categories in mind! @U2C6SPLDS

🙌 2