Benchmarking (Open)
Labels: help wanted, good first issue, question
Can you please add some performance numbers to the main project docs indicating inference latency on common hardware options, e.g. AWS p2, a GCP GPU instance, CPU-only inference, Raspberry Pi, etc.?
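For anyone picking this up, a minimal latency-measurement sketch might look like the following. The `infer` callable is a placeholder for the project's actual inference call (not part of this repo); it includes warmup runs so one-time setup costs don't skew the numbers, and reports mean/p50/p95 latency in milliseconds:

```python
import time
import statistics

def benchmark(infer, n_warmup=10, n_runs=100):
    """Time a zero-argument inference callable; return latency stats in ms."""
    for _ in range(n_warmup):
        infer()  # warm up caches, lazy initialization, JIT, etc.
    samples = []
    for _ in range(n_runs):
        start = time.perf_counter()
        infer()
        samples.append((time.perf_counter() - start) * 1e3)  # seconds -> ms
    samples.sort()
    return {
        "mean_ms": statistics.mean(samples),
        "p50_ms": samples[len(samples) // 2],
        "p95_ms": samples[min(int(n_runs * 0.95), n_runs - 1)],
    }

# Stand-in workload; replace with the real model call on each target machine.
stats = benchmark(lambda: sum(i * i for i in range(10_000)))
print({k: round(v, 3) for k, v in stats.items()})
```

Running this same harness on each hardware target (p2, GCP GPU, CPU, Raspberry Pi) would give directly comparable numbers for the docs; reporting p50/p95 rather than a single run avoids misleading outliers.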