# orm-help
m
When I try to deploy after compose with 1.29-beta-6 docker tag it throws an error
"exception":"java.lang.RuntimeException: Encountered unknown SQL type timestamptz with column createdAt.
Before that, I had 1.26 docker tag and it was working without any problem.
h
Hi @Martí Crespí, can you please send me a quick reproduction for this? Betas sometimes have bugs; that is why we deploy them on the demo server first, so we can catch problems early.
m
Hi @Harshit, it's just a table with a timestamptz field called createdAt that, as far as I can tell, triggers this error.
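For reference, a table shaped like the one described (the table and column names here are assumptions, not taken from the actual project) can be created in Postgres like this:

```sql
-- Hypothetical minimal table: the timestamptz column is what the
-- Prisma deploy-time database inspection fails on in 1.29-beta-6.
CREATE TABLE "Invoice" (
  "id"        text PRIMARY KEY,
  "createdAt" timestamptz NOT NULL DEFAULT now()
);
```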
h
are you using introspection?
m
No, I'm using my datamodel.prisma generated with prisma-alpha7, which generates well-formed arrays, and I'm trying to deploy with the 1.29-beta-6 docker tag.
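A minimal datamodel of the kind described might look roughly like this (the type name is hypothetical; only the createdAt field matters). With the Postgres connector, a DateTime field such as createdAt is stored as a timestamptz column, which is the type the error complains about:

```graphql
# Hypothetical minimal model reproducing the error.
type Invoice {
  id: ID! @id
  createdAt: DateTime! @createdAt
}
```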
h
ok, are you able to reproduce this in the latest stable version, 1.28.3?
m
Do you mean the deploy? I need a higher version to use introspection, because the last stable version doesn't generate the relations as arrays.
h
I mean the prisma server version
m
But I would like to try Prisma Admin, and that requires 1.29 or higher. That's why I tried this version!
h
you can use https://padmin.netlify.com for that while we fix this
please confirm if this works in 1.28.3
m
Yes, I'll recompose my docker image now and let you know
java.lang.RuntimeException: Encountered unknown SQL type timestamptz with column createdAt. IntrospectedColumn(createdAt,timestamptz,null,false)
at scala.sys.package$.error(package.scala:26)
at com.prisma.deploy.connector.jdbc.DatabaseInspectorBase.$anonfun$getTable$6(DatabaseInspectorBase.scala:57)
at scala.Option.getOrElse(Option.scala:121)
at com.prisma.deploy.connector.jdbc.DatabaseInspectorBase.$anonfun$getTable$5(DatabaseInspectorBase.scala:57)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
at scala.collection.Iterator.foreach(Iterator.scala:937)
at scala.collection.Iterator.foreach$(Iterator.scala:937)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
at scala.collection.IterableLike.foreach(IterableLike.scala:70)
at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike.map(TraversableLike.scala:233)
at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at com.prisma.deploy.connector.jdbc.DatabaseInspectorBase.$anonfun$getTable$4(DatabaseInspectorBase.scala:54)
at slick.dbio.DBIOAction.$anonfun$map$1(DBIOAction.scala:43)
at slick.basic.BasicBackend$DatabaseDef.$anonfun$runInContextInline$1(BasicBackend.scala:171)
at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:303)
at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:37)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Same error with 1.28.3!
Tried again with 1.26 and it works perfectly @Harshit!
h
Cc @marcus or @do4gr
m
If you want, I can open an issue on GitHub.
h
Yes, that would be awesome if you have a repro
m
It's the same repro as always: my billing project, which appears in all my issues on the prisma repo.
I will open it in a few minutes. Thanks @Harshit!