
Does anyone have a favorite way to sharpen up their SQL skills? (Non-Clojure specific, but I'd use honeysql as the medium.) Mine have had their ups and downs as I've moved between different data stores and abstractions over the years... I bet there are cool corners to be discovered anyway :)


Cheers. PL/SQL is a bit outside of my personal scope, but still an interesting pointer. So far it has caught my attention. The video format is not my jam, but I can still do it since they're bite-sized


Other than reading the docs for all the functions/syntax included with the database, try the puzzles people put out, especially ones not obviously geared toward SQL. For example:


Sudoku might also be a good challenge (if your database has a reasonable way to do it)


Hi, I am trying to import multiple CSVs into a PostgreSQL DB. I need to convert values automatically to be compatible with the target PostgreSQL table so it won't fail like the following:

ERROR: column "gcagt6" is of type date but expression is of type character varying
  Hint: You will need to rewrite or cast the expression.
  Position: 187
I am basically looking for automatic conversion like psql does. What would be the simplest way? The only library I am using right now is next.jdbc.


I am trying to insert a string compatible with a PostgreSQL date, so I don't need any magic here. Just to export a table to CSV and then import it again using "SQL casting", so I don't have to generate type casting in the next.jdbc PreparedStatement.
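If you want server-side casting with no per-column handling in Clojure, one option (assuming the PostgreSQL JDBC driver, pgjdbc) is the `stringtype=unspecified` connection parameter, which sends string parameters with no declared type so the server infers the column's type. A minimal sketch; the database name is hypothetical:

```clojure
(require '[next.jdbc :as jdbc])

;; Sketch, assuming pgjdbc: extra db-spec keys are passed through as
;; driver connection properties. With stringtype=unspecified the server
;; casts "2000-01-13" to date (or leaves it varchar) per the column type.
(def ds
  (jdbc/get-datasource
    {:dbtype     "postgresql"
     :dbname     "mydb"          ; hypothetical database name
     :stringtype "unspecified"}))

(jdbc/execute! ds
  ["INSERT INTO public.table2 (name, birth, car) VALUES (?, ?, ?)"
   "John" "2000-01-13" "Renault"])
```

The trade-off is that this changes typing behavior for every string parameter on the connection, not just the CSV import.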


Do you know which column needs to be a date?


and don't set the type?


also maybe [:lift "param"] might do it


and maybe those need to be used together
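Combining the two suggestions above, a hedged HoneySQL 2.x sketch (table and values are from the example CSVs; `[:lift x]` keeps `x` as a plain parameter, `[:cast expr type]` wraps it in a SQL CAST):

```clojure
(require '[honey.sql :as sql])

;; Emits roughly:
;;   INSERT INTO public.table2 (name, birth, car)
;;   VALUES (?, CAST(? AS DATE), ?)
;; with ["John" "2000-01-13" "Renault"] as parameters, so the cast
;; happens in SQL and the JDBC parameter stays a plain String.
(sql/format {:insert-into :public.table2
             :columns     [:name :birth :car]
             :values      [["John"
                            [:cast [:lift "2000-01-13"] :date]
                            "Renault"]]})
```

For a plain string value `:lift` is likely redundant, but it is harmless and makes the intent explicit.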


@emccue I don't. I would have to check the database schema first, and that is something I am trying to avoid (maybe it's the only way? I would love to know how DBeaver solves this).


oh wait i misread your question maybe


@U8QBZBHGD 🙂. I just want JDBC to take the String "2020-01-01" and insert it into the database without caring whether the column is varchar, text, or date. Like psql COPY or a plain SQL insert does.


Yeah, but let's say I extend the protocol so the string "2020-01-01" becomes a date. In the second CSV the same-looking value is a string/varchar, so the import will fail because it gets cast to a date…


This is the use case: some CSV files will be generated every day. They will be named like public.table1.csv, public.table2.csv, etc.; the name determines the target database schema.table. public.table1.csv will contain:

event, date
error, 2020-01-1
warning, 2018-02-21

Its target table has the structure:
• event: varchar(23)
• date: varchar(10)

public.table2.csv will contain:

name, birth, car
John, 2000-01-13, Renault
Martin, 1983-04-21, Peugor

Its target table has the structure:
• name: varchar(255)
• birth: date
• car: varchar(50)

I want to load every created CSV into PostgreSQL automatically, every day. And it fails because birth is a date.


Yep. I will query information_schema for the column types in the table and then use next.jdbc.types to convert the Strings to the corresponding types.
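That lookup step could be sketched like this (assuming next.jdbc; the function name and the unqualified-keys result builder are choices of this sketch, not anything from the thread):

```clojure
(require '[next.jdbc :as jdbc]
         '[next.jdbc.result-set :as rs])

(defn column-types
  "Return a map of column name -> data_type for schema.table.
   A sketch: queries the standard information_schema.columns view."
  [ds schema table]
  (into {}
        (map (juxt :column_name :data_type))
        (jdbc/execute! ds
          ["SELECT column_name, data_type
              FROM information_schema.columns
             WHERE table_schema = ? AND table_name = ?"
           schema table]
          {:builder-fn rs/as-unqualified-lower-maps})))

;; e.g. (column-types ds "public" "table2")
;; would yield something like {"name" "character varying", "birth" "date", ...}
```

With that map in hand, each CSV cell can be wrapped with the matching next.jdbc.types helper (e.g. `as-date` for "date" columns) before building the insert.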