Alex
02/19/2023, 3:16 AM
kotlin-deeplearning-visualization
Alexandre Brown
02/22/2023, 1:46 AM
zt
02/23/2023, 10:57 PM
Alex
02/24/2023, 6:25 AM
package org.jetbrains.kotlinx.dataframe.examples.titanic.ml
Luc Girardin
03/03/2023, 10:54 PM
darkmoon_uk
03/22/2023, 11:06 AM
spierce7
03/27/2023, 6:49 PM
Giraudon
04/03/2023, 2:19 PM
*Julia Beliaeva* [21 h 06]
Hi, no, it's not possible right now. You'll need to train your model elsewhere and convert it to onnx format to use with KotlinDL in Android.
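For the conversion step Julia mentions, a TF/Keras SavedModel can usually be exported to ONNX with the tf2onnx command-line tool. A minimal sketch (the directory and file names below are placeholders, not paths from this thread):

```shell
# Install the converter, then turn a TF2 SavedModel directory into an .onnx file
# usable with KotlinDL's ONNX module. --opset pins the ONNX operator set version.
pip install tf2onnx
python -m tf2onnx.convert --saved-model ./saved_model_dir --output model.onnx --opset 13
```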
Was an answer provided 3 months ago... is it still current? I mean: will there be a version of KotlinDL that allows you to train a model from scratch, within an Android phone, using local data? Because I believe this is the "grail" most of us are waiting for! Then the next question is: can we write code using the current KotlinDL that will run the calculations used in a training process? Thanks a lot in advance for your help. Vinz
zaleslaw
04/13/2023, 7:52 AM
Stefan Oltmann
04/14/2023, 4:58 AM
zaleslaw
05/22/2023, 2:31 PM
zaleslaw
05/22/2023, 2:33 PM
spierce7
08/26/2023, 2:45 PM
Marcus Cvjeticanin
08/31/2023, 12:32 PM
James Yox
09/05/2023, 1:00 AM
(1, 52, 52, 3, 85)
then parse the bounding boxes... I can create a tensor from data of that shape, but when I try to pull the info out, it doesn't seem to be a valid representation of bounding boxes. I could be missing many things, though, since this is my first foray into using anything beyond a super-high-level ML library, so it may be something very obvious.
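A (1, 52, 52, 3, 85) YOLOv4 head is a 52x52 grid with 3 anchors per cell and 85 values per anchor (4 box coordinates, 1 objectness logit, 80 class logits). A plain-Kotlin sketch of walking that flattened buffer and keeping confident boxes; the names (`decode`, `Detection`) are mine, and whether the raw coordinates still need anchor/stride decoding depends on how the particular ONNX export post-processes its outputs:

```kotlin
import kotlin.math.exp

// One YOLOv4 output head: (1, GRID, GRID, ANCHORS, 5 + CLASSES), flattened row-major.
const val GRID = 52
const val ANCHORS = 3
const val CLASSES = 80
const val STRIDE = 5 + CLASSES  // 85 values per anchor box

data class Detection(val cx: Float, val cy: Float, val w: Float, val h: Float,
                     val score: Float, val classId: Int)

fun sigmoid(x: Float): Float = (1.0 / (1.0 + exp(-x.toDouble()))).toFloat()

// Decode a flattened (1, GRID, GRID, ANCHORS, 85) tensor into detections
// whose objectness clears the threshold.
fun decode(output: FloatArray, threshold: Float = 0.5f): List<Detection> {
    val detections = mutableListOf<Detection>()
    for (y in 0 until GRID) for (x in 0 until GRID) for (a in 0 until ANCHORS) {
        val base = ((y * GRID + x) * ANCHORS + a) * STRIDE
        val objectness = sigmoid(output[base + 4])
        if (objectness < threshold) continue
        // Pick the most probable class for this box.
        var bestClass = 0
        var bestProb = 0f
        for (c in 0 until CLASSES) {
            val p = sigmoid(output[base + 5 + c])
            if (p > bestProb) { bestProb = p; bestClass = c }
        }
        detections += Detection(
            cx = output[base], cy = output[base + 1],
            w = output[base + 2], h = output[base + 3],
            score = objectness * bestProb, classId = bestClass)
    }
    return detections
}
```

After this, overlapping detections would normally be merged with non-max suppression before use.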
Model Documentation: https://github.com/onnx/models/tree/main/vision/object_detection_segmentation/yolov4
Peter
10/02/2023, 6:20 AM
Saher Al-Sous
10/18/2023, 3:08 PM
In Spring Boot, I loaded the model successfully and created the needed tensor, but I can't make a call to get the result from the model because of this error:
Caused by: org.tensorflow.exceptions.TFInvalidArgumentException: Expects arg[0] to be float but uint8 is provided
I checked the model signature and it was like this:
Signature for "serving_default":
Method: "tensorflow/serving/predict"
Inputs:
"input_1": dtype=DT_FLOAT, shape=(-1, 299, 299, 3)
Outputs:
"dense_3": dtype=DT_FLOAT, shape=(-1, 41)
Signature for "__saved_model_init_op":
Outputs:
"__saved_model_init_op": dtype=DT_INVALID, shape=()
My tensor is a DT_UINT8 tensor with shape [299, 299, 3].
When I changed my tensor's data type to float like this:
val imageShape = TFloat32.tensorOf(runner.fetch(decodeImage).run()[0].shape())
val reshape = tf.reshape(
decodeImage,
tf.array(
-1.0f,
imageShape[0].getFloat(),
imageShape[1].getFloat(),
imageShape[2].getFloat())
)
I got this error:
org.tensorflow.exceptions.TFInvalidArgumentException: Value for attr 'Tshape' of float is not in the list of allowed values: int32, int64
If someone is curious how I loaded the model, created the tensor and called it, here is the code below. Loading the model in `TFServices`:
fun model(): SavedModelBundle {
    return SavedModelBundle
        .loader("/home/***/src/main/resources/pd/")
        .withRunOptions(RunOptions.getDefaultInstance())
        .load()
}
Building the tensor and calling the model:
val graph = Graph()
val session = Session(graph)
val tf = Ops.create(graph)
val fileName = tf.constant("/home/***/src/main/resources/keyframe_1294.jpg")
val readFile = tf.io.readFile(fileName)
val runner = session.runner()
val decodingOptions = DecodeJpeg.channels(3)
val decodeImage = tf.image.decodeJpeg(readFile.contents(), decodingOptions)
val imageShape = runner.fetch(decodeImage).run()[0].shape()
val reshape = tf.reshape(
decodeImage,
tf.array(
-1,
imageShape.asArray()[0],
imageShape.asArray()[1],
imageShape.asArray()[2])
)
val tensor = runner.fetch(reshape).run()[0]
val inputMap = mutableMapOf("input_tensor" to tensor)
println(tensor.shape())
println(tensor.dataType())
println(tensor.asRawTensor())
val result = tfService.model().function("serving_default").call(inputMap)
and I used this dependency:
implementation("org.tensorflow:tensorflow-core-platform:0.5.0")
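The two errors above point at dtypes, not shapes: `decodeJpeg` produces a DT_UINT8 tensor, and `tf.reshape` never changes the element type, while its shape argument must itself be int32/int64 (which is exactly what the `'Tshape' of float` error complains about when the shape is built with `-1.0f`). In the TF Java API the usual fix is to cast the decoded image before reshaping, along the lines of `tf.dtypes.cast(decodeImage, TFloat32.class)` (hedged: check the exact signature against the 0.5.x `Ops` Javadoc), then reshape with an integer shape such as `(-1, 299, 299, 3)`. The cast itself, plus the usual pixel scaling, is just an element-wise conversion, sketched here in plain Kotlin:

```kotlin
// Plain-Kotlin sketch of the uint8 -> float conversion a DT_FLOAT input expects:
// each byte pixel becomes a float, typically scaled into [0, 1].
// Whether the model wants [0, 1], [-1, 1], or raw 0..255 floats depends on how
// it was trained, so the scale here is an assumption, not taken from the thread.
fun uint8ToFloat(pixels: ByteArray, scale: Float = 255f): FloatArray =
    FloatArray(pixels.size) { i -> (pixels[i].toInt() and 0xFF) / scale }
```

The `and 0xFF` masks off the sign extension Kotlin applies when widening a `Byte`, so pixel value 255 maps to 1.0f rather than a negative float.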
Then I changed the whole code and used the KotlinDL dependencies
implementation("org.jetbrains.kotlinx:kotlin-deeplearning-api:0.5.2")
implementation("org.jetbrains.kotlinx:kotlin-deeplearning-tensorflow:0.5.2")
I loaded the model:
fun myModel(): SavedModel {
    return SavedModel.load("/home/***/src/main/resources/pd/")
}
and called for the prediction:
val file = File("/home/***/src/main/resources/keyframe_1294.jpg")
val byteArray = ImageIO.read(file)
val floatArray = ImageConverter.toRawFloatArray(byteArray)
val myResult = tfService.myModel().predictSoftly(floatArray, "dense_3")
println(myResult)
but I got this error:
Caused by: org.tensorflow.TensorFlowException: Op type not registered 'DisableCopyOnRead' in binary running on My Computer. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) tf.contrib.resampler should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
Is there a fix for this, or a ready example I can learn from? All I want to do is use the model I generated with TensorFlow 2 in a Spring Boot application. Thank you!
Akram Bensalem
11/08/2023, 11:48 AM
Is it possible to change this:
%use dataframe
to something similar to this:
%use dataframe as df
Aaron Vontell
11/26/2023, 3:58 PM
Javokhir Savriev
12/11/2023, 3:27 AM
Stefan Oltmann
01/10/2024, 8:37 PM
Javokhir Savriev
03/15/2024, 5:23 AM
adambrangenberg
03/20/2024, 9:21 PM
Colton Idle
04/10/2024, 11:40 PM
Javokhir Savriev
04/22/2024, 3:15 AM
Michal Harakal
05/31/2024, 12:32 PM
Slackbot
10/03/2024, 7:08 AM
Sanjay
11/12/2024, 5:15 AM
[Alpha]-0mega-
11/18/2024, 9:07 AM
darkmoon_uk
03/08/2025, 9:50 PM