Yes, MobileNetV2 is supported along with SSDLite.
@librecomputer
I am having some issues very similar to what @charlie was having above:
MESA: error: get_param:40: get-param (1f) failed! -22 (Invalid argument)
MESA: error: get_param:40: get-param (20) failed! -22 (Invalid argument)
MESA: error: get_param:40: get-param (21) failed! -22 (Invalid argument)
MESA: error: get_param:40: get-param (22) failed! -22 (Invalid argument)
MESA: error: get_param:40: get-param (23) failed! -22 (Invalid argument)
We need at least 1 NN core to do anything useful.
Aborted
I started getting these errors after deleting my checkout and rebuilding from the main mesa branch (https://gitlab.freedesktop.org/mesa/mesa.git), as directed by the link you provided. I get these errors even with the MobileNetV1 model.
If I go back to the teflon-tp branch from Tomeu, I have no problems with MobileNetV1, but it doesn't work with V2 or SSDLite; I get a segmentation fault. I have tried a few other branches like teflon and teflon-ci, and those yield the same MESA errors as above. What am I missing?
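As the later posts in this thread bear out, these get_param failures appear to come from a mismatch between the userspace driver and the kernel's etnaviv UAPI: the newer NPU-related parameters are unknown to the older kernel module, which rejects them with -22 (EINVAL). A quick sanity check of what is actually running, using standard tools:
# kernel version currently booted
uname -r
# confirm the etnaviv module is loaded and see what the driver probed
lsmod | grep etnaviv
sudo dmesg | grep -i etnaviv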
Hi @charlie,
I get the same error when I load the delegate. Did you solve this problem?
We are preparing a demo at Embedded World and will update the guide next week.
Updated on 2024-04-13
Now that librecomputer has updated their official kernel, the following patch is no longer necessary.
===
Hi, I overcame the error after patching etnaviv kernel module like this:
git clone https://github.com/libre-computer-project/libretech-linux.git -b linux-6.1.y-lc --single-branch --depth=1
cd libretech-linux
patch -p1 < etnaviv_kernel_module_6.1.83.patch
# copy the updated UAPI header into the installed kernel's build tree
sudo cp include/uapi/drm/etnaviv_drm.h /lib/modules/6.1.83-14793-g05e363bdd9a7/build/include/uapi/
# rebuild and reload the etnaviv module
cd drivers/gpu/drm/etnaviv
make
sudo rmmod etnaviv.ko
sudo insmod etnaviv.ko
The etnaviv_kernel_module_6.1.83.patch file was made by applying Tomeu's patch plus my additional modifications. Please review the patch for any vulnerabilities before you apply it and rebuild the module.
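After the insmod, it is worth confirming that the freshly built module is the one actually in memory; a minimal check, assuming your kernel exposes module srcversion in sysfs:
# compare the srcversion of the .ko you built with the loaded one
modinfo ./etnaviv.ko | grep srcversion
cat /sys/module/etnaviv/srcversion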
We have updated the instructions with the necessary changes. Please update to the latest kernel. We will update the test examples later.
Hi @charlie,
Thanks for the patch info; I can now run inference on the model without errors.
However, how long is your inference time in ms? I got 195.975 ms for mobilenet_v1_1.0_224_quant.tflite, which seems abnormally slow compared with Tomeu's data.
Best,
This is my result after upgrading to librecomputer's official kernel 6.1.85-15205, which is similar to what I got when I applied my patch:
Loading external delegate from ../mesa/build/src/gallium/targets/teflon/libteflon.so with args: {}
0.866667: military uniform
0.031373: Windsor tie
0.015686: mortarboard
0.007843: bow tie
0.007843: academic gown
time: 10.566ms
Loading the external delegate took about 25 seconds. I've been testing on Debian 12.
@librecomputer @charlie Have you tried running object detection using the SSDLite MobileDet model? It runs but is very slow, around 500 ms. Once it loads the Teflon delegate, I get this message: "INFO: Created TensorFlow Lite XNNPACK delegate for CPU." I don't believe it should be doing this, but I can't figure out how to disable it. I get great results when running MobileNetV1 classification, around 6 ms.
If I run the SSDLite model without the delegate, the inference time is only 200 ms, and I still get the same info message about the XNNPACK delegate.
It should be around 33 ms. Which branch are you using?
I am using the teflon-staging branch. I did get a few errors while compiling mesa, but none that prevented it from finishing. I can go back, recompile it, and post them here.
I also tried the teflon-ci branch; that did not produce any build errors, but it also yielded the 500 ms inference time.
Here are the errors I get while compiling the teflon-staging branch:
[981/1121] Compiling C object src/gall...naviv/libetnaviv.a.p/etnaviv_ml_nn.c.o
In function 'create_coefficients_bo',
inlined from 'etna_ml_compile_operation_nn' at ../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c:1354:32:
../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c:1326:24: warning: 'zrl_bits' may be used uninitialized [-Wmaybe-uninitialized]
1326 | actual_size = write_core_6(subgraph, map, core, operation, zrl_bits);
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c: In function 'etna_ml_compile_operation_nn':
../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c:1304:13: note: 'zrl_bits' was declared here
1304 | unsigned zrl_bits;
| ^~~~~~~~
In function 'create_coefficients_bo',
inlined from 'etna_ml_compile_operation_nn' at ../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c:1354:32:
../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c:1317:4: warning: 'best_compressed_size_aligned' may be used uninitialized [-Wmaybe-uninitialized]
1317 | memset(map, 0, bo_size);
| ^~~~~~~~~~~~~~~~~~~~~~~
../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c: In function 'etna_ml_compile_operation_nn':
../src/gallium/drivers/etnaviv/etnaviv_ml_nn.c:1221:13: note: 'best_compressed_size_aligned' was declared here
1221 | unsigned best_compressed_size_aligned;
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
[1121/1121] Generating src/gallium/tar...v_etnaviv_dri.so with a custom command
Those look like warnings and not errors. It doesn't run?
It does run, just with those warnings at compile time. SSDLite MobileDet runs without any errors or warnings, just very slowly. Again, MobileNetV1 works great.
Can you provide more information, like the amount of time to complete, the model filename, and any other logs related to your run?
Using the teflon-staging branch, here are my results.
Model - mobilenet_v1_1.0_224_quant.tflite, Inference time = 6.263ms
Model - ssdlite_mobiledet_coco_qat_postprocess.tflite, Inference time = 528.775ms
I am not sure how to enable debug logging for tflite, or whether that would be helpful.
Here is an example output:
Loading external delegate from /home/johnktm/teflon-staging/mesa/build/src/gallium/targets/teflon/libteflon.so with args: {}
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
time: 528.775ms
I am using the classification.py script, but I am not printing any of the output data, just the inference time; a rough sketch of the invocation follows the version list below.
Versions:
Python - 3.11.2
Meson - 1.2.3
tflite - 2.14.0
Kernel - 6.1.85-15205
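For reference, the invocation is roughly the following; the flag names here follow the stock TFLite classification example and are only illustrative, so adjust them to whatever your copy of classification.py actually accepts:
# -m selects the model, -e points at the external (Teflon) delegate
python3 classification.py \
  -m ssdlite_mobiledet_coco_qat_postprocess.tflite \
  -e /home/johnktm/teflon-staging/mesa/build/src/gallium/targets/teflon/libteflon.so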
Can you also try one more thing? Compile mesa 24.1 instead of teflon-staging?
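For anyone following along, a Teflon-enabled build of a release branch looks roughly like this; the branch name and meson options below are per the upstream Mesa Teflon documentation, so treat this as a sketch and adjust paths as needed:
# shallow-clone the 24.1 release branch (branch name assumed)
git clone -b 24.1 --depth=1 https://gitlab.freedesktop.org/mesa/mesa.git mesa-24.1
cd mesa-24.1
# enable the etnaviv gallium driver and the Teflon TFLite delegate
meson setup build -Dgallium-drivers=etnaviv -Dteflon=true
ninja -C build
# the delegate ends up at build/src/gallium/targets/teflon/libteflon.so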
I get pretty much the same results with mesa 24.1, which is odd, as MobileNetV1 should take 15 ms with that version based on the documentation; it is still running at about 7 ms. SSDLite MobileDet still takes about 500 ms. I did find out how to at least turn on the Teflon debugging info. Here is an example output:
Loading external delegate from /home/johnktm/mesa/build/src/gallium/targets/teflon/libteflon.so with args: {}
Teflon delegate: loaded etnaviv driver
teflon: compiling graph: 333 tensors 108 operations
idx scale zp has_data size
=======================================
0 0.007812 80 no 1x320x320x3
1 0.007812 80 yes 2034x4x0x0
2 0.007812 80 yes 3x0x0x0
3 0.007812 80 yes 4x0x0x0
4 0.007812 80 yes 3x0x0x0
5 0.000028 0 yes 546x0x0x0
6 0.000031 0 yes 24x0x0x0
7 0.000021 0 yes 546x0x0x0
8 0.000035 0 yes 24x0x0x0
9 0.000028 0 yes 546x0x0x0
10 0.000043 0 yes 24x0x0x0
11 0.000030 0 yes 546x0x0x0
12 0.000050 0 yes 24x0x0x0
13 0.000037 0 yes 546x0x0x0
14 0.000054 0 yes 24x0x0x0
15 0.000216 0 yes 384x0x0x0
16 0.000321 0 yes 120x0x0x0
17 0.000305 0 yes 120x0x0x0
18 0.000227 0 yes 120x0x0x0
19 0.000106 0 yes 120x0x0x0
20 0.000067 0 yes 273x0x0x0
21 0.000137 0 yes 12x0x0x0
22 0.000598 0 yes 96x0x0x0
23 0.000634 0 yes 96x0x0x0
24 0.000259 0 yes 96x0x0x0
25 0.000166 0 yes 96x0x0x0
26 0.000776 0 yes 72x0x0x0
27 0.000605 0 yes 72x0x0x0
28 0.000284 0 yes 72x0x0x0
29 0.000173 0 yes 72x0x0x0
30 0.000581 0 yes 40x0x0x0
31 0.000489 0 yes 40x0x0x0
32 0.000377 0 yes 40x0x0x0
33 0.000340 0 yes 40x0x0x0
34 0.000590 0 yes 16x0x0x0
35 0.000468 0 yes 16x0x0x0
36 0.000301 0 yes 16x0x0x0
37 0.000477 0 yes 16x0x0x0
38 0.000672 0 yes 16x0x0x0
39 0.000397 0 yes 16x0x0x0
40 0.000137 0 yes 8x0x0x0
41 0.000335 0 yes 960x0x0x0
42 0.000308 0 yes 960x0x0x0
43 0.000349 0 yes 480x0x0x0
44 0.000266 0 yes 480x0x0x0
45 0.000334 0 yes 960x0x0x0
46 0.000240 0 yes 960x0x0x0
47 0.000270 0 yes 768x0x0x0
48 0.000344 0 yes 768x0x0x0
49 0.000529 0 yes 768x0x0x0
50 0.000573 0 yes 768x0x0x0
51 0.000377 0 yes 768x0x0x0
52 0.000583 0 yes 768x0x0x0
53 0.000470 0 yes 768x0x0x0
54 0.000481 0 yes 768x0x0x0
55 0.000349 0 yes 576x0x0x0
56 0.000391 0 yes 576x0x0x0
57 0.000384 0 yes 960x0x0x0
58 0.000294 0 yes 960x0x0x0
59 0.000563 0 yes 576x0x0x0
60 0.000547 0 yes 576x0x0x0
61 0.000279 0 yes 320x0x0x0
62 0.000639 0 yes 320x0x0x0
63 0.000151 0 yes 288x0x0x0
64 0.000144 0 yes 288x0x0x0
65 0.000229 0 yes 160x0x0x0
66 0.000233 0 yes 160x0x0x0
67 0.000227 0 yes 160x0x0x0
68 0.000280 0 yes 128x0x0x0
69 0.000497 0 yes 64x0x0x0
70 0.000443 0 yes 128x0x0x0
71 0.000662 0 yes 64x0x0x0
72 0.000329 0 yes 128x0x0x0
73 0.000108 0 yes 32x0x0x0
74 0.000237 0 yes 64x0x0x0
75 0.000187 0 yes 128x0x0x0
76 0.000595 0 yes 128x0x0x0
77 0.000163 0 yes 256x0x0x0
78 0.000487 0 yes 128x0x0x0
79 0.000222 0 yes 256x0x0x0
80 0.000559 0 yes 256x0x0x0
81 0.000201 0 yes 512x0x0x0
82 0.000049 0 yes 64x0x0x0
83 0.000058 0 yes 128x0x0x0
84 0.000047 0 yes 128x0x0x0
85 0.000213 0 yes 256x0x0x0
86 0.000394 0 yes 128x0x0x0
87 0.000269 0 yes 128x0x0x0
88 0.000832 0 yes 256x0x0x0
89 0.000498 0 yes 256x0x0x0
90 0.000851 0 yes 256x0x0x0
91 0.000667 0 yes 256x0x0x0
92 0.000868 0 yes 512x0x0x0
93 0.000813 0 yes 512x0x0x0
94 0.002097 0 yes 384x0x0x0
95 0.001877 0 yes 384x0x0x0
96 0.000687 0 yes 96x0x0x0
97 0.001071 0 yes 96x0x0x0
98 0.013768 86 yes 32x3x3x3
99 0.023528 0 no 1x160x160x32
100 0.005831 5b yes 8x1x1x32
101 0.023528 0 no 1x160x160x8
102 0.016877 78 yes 16x3x3x8
103 0.023528 0 no 1x160x160x16
104 0.028541 7e yes 16x1x1x16
105 0.317024 7f no 1x160x160x16
106 0.001039 87 yes 128x3x3x16
107 0.023528 0 no 1x80x80x128
108 0.020279 72 yes 16x1x1x128
109 0.399250 93 no 1x80x80x16
110 0.001658 8f yes 64x3x3x16
111 0.023528 0 no 1x80x80x64
112 0.012792 75 yes 16x1x1x64
113 0.185369 89 no 1x80x80x16
114 0.364902 86 no 1x80x80x16
115 0.001215 89 yes 128x3x3x16
116 0.023528 0 no 1x80x80x128
117 0.019874 88 yes 16x1x1x128
118 0.295380 9a no 1x80x80x16
119 0.422037 84 no 1x80x80x16
120 0.001178 8a yes 64x3x3x16
121 0.023528 0 no 1x80x80x64
122 0.025068 6a yes 16x1x1x64
123 0.265060 7c no 1x80x80x16
124 0.438641 81 no 1x80x80x16
125 0.000639 8c yes 128x5x5x16
126 0.023528 0 no 1x40x40x128
127 0.014455 7c yes 40x1x1x128
128 0.213016 7d no 1x40x40x40
129 0.001065 83 yes 160x3x3x40
130 0.023528 0 no 1x40x40x160
131 0.016034 68 yes 40x1x1x160
132 0.215959 89 no 1x40x40x40
133 0.244231 84 no 1x40x40x40
134 0.000954 8b yes 160x3x3x40
135 0.023528 0 no 1x40x40x160
136 0.020769 84 yes 40x1x1x160
137 0.203404 73 no 1x40x40x40
138 0.283416 81 no 1x40x40x40
139 0.000809 7e yes 160x3x3x40
140 0.023528 0 no 1x40x40x160
141 0.024674 82 yes 40x1x1x160
142 0.252078 85 no 1x40x40x40
143 0.373888 86 no 1x40x40x40
144 0.001709 84 yes 320x1x1x40
145 0.023528 0 no 1x40x40x320
146 0.011841 80 yes 1x3x3x320
147 0.023528 0 no 1x20x20x320
148 0.007343 7d yes 72x1x1x320
149 0.171151 7a no 1x20x20x72
150 0.003193 70 yes 576x1x1x72
151 0.023528 0 no 1x20x20x576
152 0.023923 90 yes 1x3x3x576
153 0.023528 0 no 1x20x20x576
154 0.012066 8a yes 72x1x1x576
155 0.125240 79 no 1x20x20x72
156 0.164588 7d no 1x20x20x72
157 0.000875 80 yes 288x3x3x72
158 0.023528 0 no 1x20x20x288
159 0.025725 8a yes 72x1x1x288
160 0.149522 7e no 1x20x20x72
161 0.204098 81 no 1x20x20x72
162 0.000739 85 yes 288x3x3x72
163 0.023528 0 no 1x20x20x288
164 0.032994 69 yes 72x1x1x288
165 0.234474 6d no 1x20x20x72
166 0.279827 80 no 1x20x20x72
167 0.001396 81 yes 576x1x1x72
168 0.023528 0 no 1x20x20x576
169 0.014841 71 yes 1x5x5x576
170 0.023528 0 no 1x20x20x576
171 0.007045 8a yes 96x1x1x576
172 0.116532 7e no 1x20x20x96
173 0.004128 89 yes 768x1x1x96
174 0.023528 0 no 1x20x20x768
175 0.019962 84 yes 1x5x5x768
176 0.023528 0 no 1x20x20x768
177 0.011003 7f yes 96x1x1x768
178 0.089881 71 no 1x20x20x96
179 0.134499 78 no 1x20x20x96
180 0.004334 a4 yes 768x1x1x96
181 0.023528 0 no 1x20x20x768
182 0.016030 7f yes 1x3x3x768
183 0.023528 0 no 1x20x20x768
184 0.026939 8c yes 96x1x1x768
185 0.167958 6e no 1x20x20x96
186 0.220141 7b no 1x20x20x96
187 0.002603 77 yes 768x1x1x96
188 0.023528 0 no 1x20x20x768
189 0.022502 81 yes 1x3x3x768
190 0.023528 0 no 1x20x20x768
191 0.025405 85 yes 96x1x1x768
192 0.337801 86 no 1x20x20x96
193 0.362627 8b no 1x20x20x96
194 0.002955 87 yes 1x3x3x96
195 0.023528 0 no 1x20x20x96
196 0.005824 b1 yes 12x1x1x96
197 0.073659 be no 1x20x20x12
198 0.073659 be no 1x1200x1x4
199 0.001893 7c yes 1x3x3x96
200 0.023528 0 no 1x20x20x96
201 0.002867 79 yes 273x1x1x96
202 0.046695 e5 no 1x20x20x273
203 0.046695 e5 no 1x1200x91x0
204 0.000950 7e yes 768x1x1x96
205 0.023528 0 no 1x20x20x768
206 0.011480 6b yes 1x5x5x768
207 0.023528 0 no 1x10x10x768
208 0.004486 8e yes 120x1x1x768
209 0.094560 8e no 1x10x10x120
210 0.002536 71 yes 960x1x1x120
211 0.023528 0 no 1x10x10x960
212 0.014212 76 yes 1x3x3x960
213 0.023528 0 no 1x10x10x960
214 0.009651 92 yes 120x1x1x960
215 0.078762 8a no 1x10x10x120
216 0.093230 85 no 1x10x10x120
217 0.002857 85 yes 480x1x1x120
218 0.023528 0 no 1x10x10x480
219 0.014843 75 yes 1x5x5x480
220 0.022253 0 no 1x10x10x480
221 0.013717 5f yes 120x1x1x480
222 0.075735 76 no 1x10x10x120
223 0.100618 7f no 1x10x10x120
224 0.003058 8d yes 960x1x1x120
225 0.023528 0 no 1x10x10x960
226 0.014256 79 yes 1x3x3x960
227 0.021790 0 no 1x10x10x960
228 0.014721 77 yes 120x1x1x960
229 0.112479 8f no 1x10x10x120
230 0.159471 8f no 1x10x10x120
231 0.001844 73 yes 960x1x1x120
232 0.023528 0 no 1x10x10x960
233 0.016324 66 yes 1x5x5x960
234 0.023528 0 no 1x10x10x960
235 0.009160 8f yes 384x1x1x960
236 0.100989 84 no 1x10x10x384
237 0.018587 7f yes 1x3x3x384
238 0.023528 0 no 1x10x10x384
239 0.002302 7f yes 24x1x1x384
240 0.034358 8f no 1x10x10x24
241 0.034358 8f no 1x600x1x4
242 0.073659 be no 1x600x1x4
243 0.020767 8c yes 1x3x3x384
244 0.023528 0 no 1x10x10x384
245 0.001580 88 yes 546x1x1x384
246 0.048552 dc no 1x10x10x546
247 0.048552 dc no 1x600x91x0
248 0.046695 e5 no 1x600x91x0
249 0.002111 7d yes 256x1x1x384
250 0.023528 0 no 1x10x10x256
251 0.023756 81 yes 1x3x3x256
252 0.023528 0 no 1x5x5x256
253 0.008563 84 yes 512x1x1x256
254 0.023528 0 no 1x5x5x512
255 0.034563 76 yes 1x3x3x512
256 0.023528 0 no 1x5x5x512
257 0.002108 75 yes 24x1x1x512
258 0.028891 7f no 1x5x5x24
259 0.028891 7f no 1x150x1x4
260 0.073659 be no 1x150x1x4
261 0.036906 9f yes 1x3x3x512
262 0.023528 0 no 1x5x5x512
263 0.001266 9f yes 546x1x1x512
264 0.040888 d9 no 1x5x5x546
265 0.040888 d9 no 1x150x91x0
266 0.046695 e5 no 1x150x91x0
267 0.002016 7e yes 128x1x1x512
268 0.022842 0 no 1x5x5x128
269 0.021330 a2 yes 1x3x3x128
270 0.023494 0 no 1x3x3x128
271 0.009458 76 yes 256x1x1x128
272 0.023528 0 no 1x3x3x256
273 0.028351 8f yes 1x3x3x256
274 0.023528 0 no 1x3x3x256
275 0.001809 7e yes 24x1x1x256
276 0.027995 7b no 1x3x3x24
277 0.027995 7b no 1x54x1x4
278 0.073659 be no 1x54x1x4
279 0.036184 b3 yes 1x3x3x256
280 0.023528 0 no 1x3x3x256
281 0.001170 97 yes 546x1x1x256
282 0.039151 db no 1x3x3x546
283 0.039151 db no 1x54x91x0
284 0.046695 e5 no 1x54x91x0
285 0.002459 7f yes 128x1x1x256
286 0.018352 0 no 1x3x3x128
287 0.032409 87 yes 1x3x3x128
288 0.012468 0 no 1x2x2x128
289 0.013053 7c yes 256x1x1x128
290 0.022235 0 no 1x2x2x256
291 0.022392 88 yes 1x3x3x256
292 0.023528 0 no 1x2x2x256
293 0.001482 83 yes 24x1x1x256
294 0.020045 89 no 1x2x2x24
295 0.020045 89 no 1x24x1x4
296 0.073659 be no 1x24x1x4
297 0.037434 96 yes 1x3x3x256
298 0.023528 0 no 1x2x2x256
299 0.000906 74 yes 546x1x1x256
300 0.030463 e7 no 1x2x2x546
301 0.030463 e7 no 1x24x91x0
302 0.046695 e5 no 1x24x91x0
303 0.002209 b1 yes 64x1x1x256
304 0.014257 0 no 1x2x2x64
305 0.016612 62 yes 1x3x3x64
306 0.011400 0 no 1x1x1x64
307 0.016383 9d yes 128x1x1x64
308 0.014088 0 no 1x1x1x128
309 0.019107 7b yes 1x3x3x128
310 0.023493 0 no 1x1x1x128
311 0.001335 7d yes 24x1x1x128
312 0.019957 95 no 1x1x1x24
313 0.019957 95 no 1x6x1x4
314 0.073659 be no 1x6x1x4
315 0.073659 be no 1x2034x1x4
316 0.073659 be no 1x2034x4x0
317 0.073659 be no 1x2034x4x0
318 0.027944 89 yes 1x3x3x128
319 0.023528 0 no 1x1x1x128
320 0.001203 8a yes 546x1x1x128
321 0.032394 dd no 1x1x1x546
322 0.032394 dd no 1x6x91x0
323 0.046695 e5 no 1x6x91x0
324 0.046695 e5 no 1x2034x91x0
325 0.003906 0 no 1x2034x91x0
326 0.003906 0 no 1x2034x91x0
327 0.003906 0 no 0x0x0x0
328 0.003906 0 no 0x0x0x0
329 0.003906 0 no 0x0x0x0
330 0.003906 0 no 0x0x0x0
0 0.000000 0 no 0x0x0x0
0 0.000000 0 no 0x0x0x0
idx type in out operation type-specific
================================================================================================
0 CONV 0 99 w: 98 b: 73 stride: 2 pad: SAME
1 CONV 99 101 w: 100 b: 40 stride: 1 pad: SAME
2 CONV 101 103 w: 102 b: 39 stride: 1 pad: SAME
3 CONV 103 105 w: 104 b: 38 stride: 1 pad: SAME
4 CONV 105 107 w: 106 b: 72 stride: 2 pad: SAME
5 CONV 107 109 w: 108 b: 37 stride: 1 pad: SAME
6 CONV 109 111 w: 110 b: 71 stride: 1 pad: SAME
7 CONV 111 113 w: 112 b: 36 stride: 1 pad: SAME
8 ADD 113 114 in: 109
9 CONV 114 116 w: 115 b: 70 stride: 1 pad: SAME
10 CONV 116 118 w: 117 b: 35 stride: 1 pad: SAME
11 ADD 118 119 in: 114
12 CONV 119 121 w: 120 b: 69 stride: 1 pad: SAME
13 CONV 121 123 w: 122 b: 34 stride: 1 pad: SAME
14 ADD 123 124 in: 119
15 CONV 124 126 w: 125 b: 68 stride: 2 pad: SAME
16 CONV 126 128 w: 127 b: 33 stride: 1 pad: SAME
17 CONV 128 130 w: 129 b: 67 stride: 1 pad: SAME
18 CONV 130 132 w: 131 b: 32 stride: 1 pad: SAME
19 ADD 132 133 in: 128
20 CONV 133 135 w: 134 b: 66 stride: 1 pad: SAME
21 CONV 135 137 w: 136 b: 31 stride: 1 pad: SAME
22 ADD 137 138 in: 133
23 CONV 138 140 w: 139 b: 65 stride: 1 pad: SAME
24 CONV 140 142 w: 141 b: 30 stride: 1 pad: SAME
25 ADD 142 143 in: 138
26 CONV 143 145 w: 144 b: 62 stride: 1 pad: SAME
27 DWCONV 145 147 w: 146 b: 61 stride: 2 pad: SAME
28 CONV 147 149 w: 148 b: 29 stride: 1 pad: SAME
29 CONV 149 151 w: 150 b: 60 stride: 1 pad: SAME
30 DWCONV 151 153 w: 152 b: 59 stride: 1 pad: SAME
31 CONV 153 155 w: 154 b: 28 stride: 1 pad: SAME
32 ADD 155 156 in: 149
33 CONV 156 158 w: 157 b: 64 stride: 1 pad: SAME
34 CONV 158 160 w: 159 b: 27 stride: 1 pad: SAME
35 ADD 160 161 in: 156
36 CONV 161 163 w: 162 b: 63 stride: 1 pad: SAME
37 CONV 163 165 w: 164 b: 26 stride: 1 pad: SAME
38 ADD 165 166 in: 161
39 CONV 166 168 w: 167 b: 56 stride: 1 pad: SAME
40 DWCONV 168 170 w: 169 b: 55 stride: 1 pad: SAME
41 CONV 170 172 w: 171 b: 25 stride: 1 pad: SAME
42 CONV 172 174 w: 173 b: 54 stride: 1 pad: SAME
43 DWCONV 174 176 w: 175 b: 53 stride: 1 pad: SAME
44 CONV 176 178 w: 177 b: 24 stride: 1 pad: SAME
45 ADD 178 179 in: 172
46 CONV 179 181 w: 180 b: 52 stride: 1 pad: SAME
47 DWCONV 181 183 w: 182 b: 51 stride: 1 pad: SAME
48 CONV 183 185 w: 184 b: 23 stride: 1 pad: SAME
49 ADD 185 186 in: 179
50 CONV 186 188 w: 187 b: 50 stride: 1 pad: SAME
51 DWCONV 188 190 w: 189 b: 49 stride: 1 pad: SAME
52 CONV 190 192 w: 191 b: 22 stride: 1 pad: SAME
53 ADD 192 193 in: 186
54 DWCONV 193 195 w: 194 b: 97 stride: 1 pad: SAME
55 CONV 195 197 w: 196 b: 21 stride: 1 pad: SAME
56 DWCONV 193 200 w: 199 b: 96 stride: 1 pad: SAME
57 CONV 200 202 w: 201 b: 20 stride: 1 pad: SAME
58 CONV 193 205 w: 204 b: 48 stride: 1 pad: SAME
59 DWCONV 205 207 w: 206 b: 47 stride: 2 pad: SAME
60 CONV 207 209 w: 208 b: 19 stride: 1 pad: SAME
61 CONV 209 211 w: 210 b: 46 stride: 1 pad: SAME
62 DWCONV 211 213 w: 212 b: 45 stride: 1 pad: SAME
63 CONV 213 215 w: 214 b: 18 stride: 1 pad: SAME
64 ADD 215 216 in: 209
65 CONV 216 218 w: 217 b: 44 stride: 1 pad: SAME
66 DWCONV 218 220 w: 219 b: 43 stride: 1 pad: SAME
67 CONV 220 222 w: 221 b: 17 stride: 1 pad: SAME
68 ADD 222 223 in: 216
69 CONV 223 225 w: 224 b: 42 stride: 1 pad: SAME
70 DWCONV 225 227 w: 226 b: 41 stride: 1 pad: SAME
71 CONV 227 229 w: 228 b: 16 stride: 1 pad: SAME
72 ADD 229 230 in: 223
73 CONV 230 232 w: 231 b: 58 stride: 1 pad: SAME
74 DWCONV 232 234 w: 233 b: 57 stride: 1 pad: SAME
75 CONV 234 236 w: 235 b: 15 stride: 1 pad: SAME
76 DWCONV 236 238 w: 237 b: 95 stride: 1 pad: SAME
77 CONV 238 240 w: 239 b: 14 stride: 1 pad: SAME
78 DWCONV 236 244 w: 243 b: 94 stride: 1 pad: SAME
79 CONV 244 246 w: 245 b: 13 stride: 1 pad: SAME
80 CONV 236 250 w: 249 b: 85 stride: 1 pad: SAME
81 DWCONV 250 252 w: 251 b: 80 stride: 2 pad: SAME
82 CONV 252 254 w: 253 b: 81 stride: 1 pad: SAME
83 DWCONV 254 256 w: 255 b: 93 stride: 1 pad: SAME
84 CONV 256 258 w: 257 b: 12 stride: 1 pad: SAME
85 DWCONV 254 262 w: 261 b: 92 stride: 1 pad: SAME
86 CONV 262 264 w: 263 b: 11 stride: 1 pad: SAME
87 CONV 254 268 w: 267 b: 84 stride: 1 pad: SAME
88 DWCONV 268 270 w: 269 b: 78 stride: 2 pad: SAME
89 CONV 270 272 w: 271 b: 79 stride: 1 pad: SAME
90 DWCONV 272 274 w: 273 b: 91 stride: 1 pad: SAME
91 CONV 274 276 w: 275 b: 10 stride: 1 pad: SAME
92 DWCONV 272 280 w: 279 b: 90 stride: 1 pad: SAME
93 CONV 280 282 w: 281 b: 9 stride: 1 pad: SAME
94 CONV 272 286 w: 285 b: 83 stride: 1 pad: SAME
95 DWCONV 286 288 w: 287 b: 76 stride: 2 pad: SAME
96 CONV 288 290 w: 289 b: 77 stride: 1 pad: SAME
97 DWCONV 290 292 w: 291 b: 89 stride: 1 pad: SAME
98 CONV 292 294 w: 293 b: 8 stride: 1 pad: SAME
99 DWCONV 290 298 w: 297 b: 88 stride: 1 pad: SAME
100 CONV 298 300 w: 299 b: 7 stride: 1 pad: SAME
101 CONV 290 304 w: 303 b: 82 stride: 1 pad: SAME
102 DWCONV 304 306 w: 305 b: 74 stride: 2 pad: SAME
103 CONV 306 308 w: 307 b: 75 stride: 1 pad: SAME
104 DWCONV 308 310 w: 309 b: 87 stride: 1 pad: SAME
105 CONV 310 312 w: 311 b: 6 stride: 1 pad: SAME
106 DWCONV 308 319 w: 318 b: 86 stride: 1 pad: SAME
107 CONV 319 321 w: 320 b: 5 stride: 1 pad: SAME
teflon: compiled graph, took 69242 ms
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
teflon: invoked graph, took 543 ms
teflon: invoked graph, took 510 ms
teflon: invoked graph, took 579 ms
teflon: invoked graph, took 507 ms
teflon: invoked graph, took 512 ms
time: 528.116ms
I wonder if I should just start over with a fresh debian install.
Tomeu says to try ETNA_MESA_DEBUG=npu_no_parallel.
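That is an environment variable read by the etnaviv driver, so it has to be present in the environment of the process that loads the delegate; for a single run, something like:
ETNA_MESA_DEBUG=npu_no_parallel python3 classification.py <your usual arguments>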
Awesome! That worked; I'm getting about 24 ms inference time with the SSDLite MobileDet model.
I will continue with some testing and come back if I have any other issues.
I do have some questions and am looking for guidance on using the hardware decoders on the Alta with GStreamer, but I will open another topic for that.