Issue
I am trying to build a recommendation system using TFLite on Android. I have created the model successfully and have also run inference on it, which works well. The problem lies in integrating the model with the application. I am trying to integrate the model into the official application provided by the TensorFlow team. I have followed all the steps they describe, but I ran into a problem with the model's input/output. I got the following error:
Cannot convert between a TensorFlowLite tensor with type FLOAT32 and a Java object of type [I (which is compatible with the TensorFlowLite type INT32).
I am not able to understand what this error means, nor is there any documentation about it. The input/output handling in the official code is shown below. This is the main method where the inputs and outputs are defined:
/** Given a list of selected items, returns the recommendation results. */
@WorkerThread
public synchronized List<Result> recommend(List<MovieItem> selectedMovies) {
  Object[] inputs = preprocess(selectedMovies);

  // Run inference.
  float[] outputIds = new float[config.outputLength];
  float[] confidences = new float[config.outputLength];
  Map<Integer, Object> outputs = new HashMap<>();
  outputs.put(config.outputIdsIndex, outputIds);
  outputs.put(config.outputScoresIndex, confidences);
  tflite.runForMultipleInputsOutputs(inputs, outputs);

  return postprocess(outputIds, confidences, selectedMovies);
}
This defines the preprocessing part:
int[] preprocessIds(List<MovieItem> selectedMovies, int length) {
  int[] inputIds = new int[length];
  Arrays.fill(inputIds, config.pad); // Fill inputIds with the default.
  int i = 0;
  for (MovieItem item : selectedMovies) {
    if (i >= inputIds.length) {
      break;
    }
    inputIds[i] = item.id;
    ++i;
  }
  return inputIds;
}
int[] preprocessGenres(List<MovieItem> selectedMovies, int length) {
  // Fill inputGenres.
  int[] inputGenres = new int[length];
  Arrays.fill(inputGenres, config.unknownGenre); // Fill inputGenres with the default.
  int i = 0;
  for (MovieItem item : selectedMovies) {
    if (i >= inputGenres.length) {
      break;
    }
    for (String genre : item.genres) {
      if (i >= inputGenres.length) {
        break;
      }
      inputGenres[i] = genres.containsKey(genre) ? genres.get(genre) : config.unknownGenre;
      ++i;
    }
  }
  return inputGenres;
}
/** Given a list of selected items, preprocess to get tflite input. */
@WorkerThread
synchronized Object[] preprocess(List<MovieItem> selectedMovies) {
  List<Object> inputs = new ArrayList<>();

  // Sort features.
  List<Feature> sortedFeatures = new ArrayList<>(config.inputs);
  Collections.sort(sortedFeatures, (Feature a, Feature b) -> Integer.compare(a.index, b.index));

  for (Feature feature : sortedFeatures) {
    if (Config.FEATURE_MOVIE.equals(feature.name)) {
      inputs.add(preprocessIds(selectedMovies, feature.inputLength));
    } else if (Config.FEATURE_GENRE.equals(feature.name)) {
      inputs.add(preprocessGenres(selectedMovies, feature.inputLength));
    } else {
      Log.e(TAG, String.format("Invalid feature: %s", feature.name));
    }
  }
  return inputs.toArray();
}
What changes are needed to get the recommendations working?
Edit: I was able to solve the above issue. I found that the genre input is expected to be of type float, so I passed a float array of genres as input and the error was resolved. However, a new error occurred:
java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: tensorflow/lite/kernels/reshape.cc:66 num_input_elements != num_output_elements (10 != 32) Node number 0 (RESHAPE) failed to prepare.
The issue is that the number of input elements does not match the number of output elements of the RESHAPE node. I am not able to figure out how to solve it.
The link to the model.tflite can be found here:
https://drive.google.com/file/d/1CZxlJRqLZmwrsmgcA8lBz6XCh2KG3lWa/view?usp=sharing
Solution
This question was the result of some confusion, and of being somewhat misled by the Colab file. The Colab file in tensorflow/examples/recommendation walks through creating a TensorFlow Lite model with three inputs (genre, rating, and movie id), but the Android application in the same repository only handles two inputs (movie id and genre). The Colab can be found at:
As per guidance from @Farmaker, I visualized my model and the model shipped inside the Android application in the tensorflow-recommendation repo. Here is how they look:
My model:
Google's model:
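If a visualizer is not handy, the same information can also be read programmatically through the TFLite Interpreter API. Below is a minimal sketch (not part of the original app; the class and tag names are placeholders) that logs the data type and shape of every input tensor, which is enough to see whether an input expects INT32 or FLOAT32 and how many elements it needs:

import android.util.Log;

import java.util.Arrays;

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.Tensor;

class ModelInspector {
  private static final String TAG = "ModelInspector"; // placeholder tag

  /** Logs the data type and shape of every input tensor of the given interpreter. */
  static void logInputTensors(Interpreter tflite) {
    for (int i = 0; i < tflite.getInputTensorCount(); i++) {
      Tensor t = tflite.getInputTensor(i);
      // Prints e.g. "input 1: FLOAT32 [1, 32]", i.e. the type and length the
      // Java-side arrays have to match.
      Log.d(TAG, String.format("input %d: %s %s", i, t.dataType(), Arrays.toString(t.shape())));
    }
  }
}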
Without looking at the code that feeds the .tflite model inside the Android app, I blindly followed the Google Colab file, which only mentions plugging the .tflite model into Android without any extra code required.
Solution:
First error:
Cannot convert between a TensorFlowLite tensor with type FLOAT32 and a Java object of type [I (which is compatible with the TensorFlowLite type INT32).
The input objects need to be in exactly the format the model requires: the second input should have been of type float, but I was passing an int array as the parameter. This led to the error above, which was resolved by passing the parameters with the types and order required by the .tflite model.
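For illustration, a float-based version of the genre preprocessing could look like the sketch below. This is an assumption about the fix rather than the exact code from the working app; it mirrors the original preprocessGenres but fills a float[] so it matches the FLOAT32 genre input:

// Sketch (assumption): same logic as preprocessGenres, but producing a float[]
// so the genre input matches the model's FLOAT32 tensor.
float[] preprocessGenresAsFloat(List<MovieItem> selectedMovies, int length) {
  float[] inputGenres = new float[length];
  Arrays.fill(inputGenres, (float) config.unknownGenre); // Fill with the default.
  int i = 0;
  for (MovieItem item : selectedMovies) {
    if (i >= inputGenres.length) {
      break;
    }
    for (String genre : item.genres) {
      if (i >= inputGenres.length) {
        break;
      }
      inputGenres[i] = genres.containsKey(genre) ? genres.get(genre) : config.unknownGenre;
      ++i;
    }
  }
  return inputGenres;
}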
Second error:
java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: tensorflow/lite/kernels/reshape.cc:66 num_input_elements != num_output_elements (10 != 32) Node number 0 (RESHAPE) failed to prepare.
This error occurred because the genres parameter requires a float array of size 32, but I was providing a float array of size 10, which is exactly what the error message says (10 != 32). It was resolved by passing a genre float array of size 32.
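One way to avoid hard-coding the length is to read it from the model itself. The following is a sketch under the assumption that the genre input is the second input tensor (index 1, which the inspection snippet above can confirm) and that a float-based helper like preprocessGenresAsFloat from the previous sketch is available:

// Sketch (assumption): derive the required genre-input length from the model so the
// Java array always matches the 32 elements the RESHAPE node expects.
int genreInputIndex = 1; // hypothetical index of the genre input; verify against the model
int[] genreShape = tflite.getInputTensor(genreInputIndex).shape(); // e.g. [1, 32]
int genreLength = genreShape[genreShape.length - 1];               // 32 for this model
float[] inputGenres = preprocessGenresAsFloat(selectedMovies, genreLength);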
Several changes were required in the Android recommendation-system repository; after making them, the code works fine.
Answered By - Karunesh Palekar