Hacker News | chony's comments

I built an app to visualize and analyze basketball shots and shooting pose with machine learning.

https://github.com/chonyy/AI-basketball-analysis

The results are pretty nice. The only problem is the slow inference speed, so I'm now refactoring the project structure and switching to a much faster YOLO model.


I built this project to automatically overlay baseball pitch motion and trajectory.

https://github.com/chonyy/ML-auto-baseball-pitching-overlay

It's ready for a quick demo. However, there are still some small improvements to make, and I'll build a web app on top of it so people can use it online.


Thanks, will definitely check it out!


Thanks for asking! The only model I used in this project is the YOLOv4 object detection model to detect the ball in each frame. I collected about 200 images to train it.

For the other parts, like the tracking and the overlay timing, I programmed them myself.

I implemented the SORT algorithm to track the ball, plus some logic to capture the overlay timing from each clip.
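The detection-plus-tracking step described above can be sketched roughly like this. Note this is a deliberately minimal greedy IoU matcher, not the author's actual code: real SORT also uses a Kalman filter for motion prediction and Hungarian matching for assignment, and all names here are illustrative.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

class GreedyTracker:
    """Keeps ball identities across frames by matching each existing
    track to the detection with the highest IoU overlap."""

    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.tracks = {}      # track_id -> last known box
        self.next_id = 0

    def update(self, detections):
        """Feed one frame's detections; returns {track_id: box}."""
        assigned = {}
        unmatched = list(detections)
        for tid, box in self.tracks.items():
            if not unmatched:
                break
            best = max(unmatched, key=lambda d: iou(box, d))
            if iou(box, best) >= self.iou_threshold:
                assigned[tid] = best
                unmatched.remove(best)
        for det in unmatched:          # unmatched detections spawn new tracks
            assigned[self.next_id] = det
            self.next_id += 1
        self.tracks = assigned
        return assigned
```

In a real pipeline the `detections` list would come from the YOLOv4 model's per-frame output; the greedy matching works here because a pitch clip usually contains only one ball.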


I'm glad that you like it! Thanks a lot!


Thanks for asking! This is not a noob question.

I would say that a similar workflow could be applied to any ball-related sport. The object detection and tracking algorithms are basically the same. Then you could add any sport-specific features!

For example, I have used a similar method to build AI Basketball Analysis.

https://github.com/chonyy/AI-basketball-analysis


Great point! It looks so much clearer than the early days of HawkEye as well.


I love the idea so much! Any idea how you could make it work on videos with subtitles turned off?

Also, may I ask for a simple workflow of this project? Here are some of my questions.

1. What method did you use to get the summary out of all the subtitles?

2. How to get the subtitles of the video (YouTube API)?

3. How to get the timestamp of the specific word in the subtitle?

I would really like to build something similar! Thanks a lot!


Hi Chony, thanks for asking.

1. What method did you use to get the summary out of all the subtitles?

I measured the semantic similarity between the words in adjacent sentences. If the words in two sentences are not very similar semantically, the sentences are split into two different chapters. To measure semantic similarity, I used word2vec (it would be more accurate with something like BERT, but this is just a prototype).
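The segmentation logic above can be sketched as follows. This is a hedged illustration, not the author's code: the real project used word2vec embeddings (e.g. via gensim), whereas here the `embeddings` dict and the `threshold` value are stand-ins you would supply yourself.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def sentence_vector(sentence, embeddings):
    """Average the vectors of the words we have embeddings for."""
    vecs = [embeddings[w] for w in sentence.lower().split() if w in embeddings]
    if not vecs:
        return None
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def split_into_chapters(sentences, embeddings, threshold=0.5):
    """Start a new chapter whenever two consecutive sentences
    fall below the similarity threshold."""
    chapters = [[sentences[0]]]
    prev = sentence_vector(sentences[0], embeddings)
    for s in sentences[1:]:
        cur = sentence_vector(s, embeddings)
        if prev is not None and cur is not None and cosine(prev, cur) < threshold:
            chapters.append([])
        chapters[-1].append(s)
        prev = cur
    return chapters
```

Averaging word vectors into a sentence vector is the simplest possible pooling; swapping in a sentence encoder like BERT, as the author notes, would sharpen the chapter boundaries.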

2. How to get the subtitles of the video (YouTube API)?

Subtitles are available in the YouTube video's HTML; you can write a crawler to get them. The YouTube API might also be an option.
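Once you have the caption payload, parsing it is straightforward. A minimal sketch, assuming the captions arrive in YouTube's legacy timedtext XML shape (`<text start="…" dur="…">` nodes); the exact format you get back from a crawler may differ:

```python
import html
import xml.etree.ElementTree as ET

def parse_timedtext(xml_text):
    """Parse timedtext-style XML into (start_sec, duration_sec, text) tuples."""
    root = ET.fromstring(xml_text)
    cues = []
    for node in root.iter("text"):
        start = float(node.get("start", 0))
        dur = float(node.get("dur", 0))
        # html.unescape is a no-op on already-clean text but handles
        # the double-escaped entities captions sometimes contain
        text = html.unescape(node.text or "")
        cues.append((start, dur, text))
    return cues
```

You would fetch `xml_text` with `urllib.request` from the caption URL embedded in the video page, or go through the official YouTube Data API captions endpoint instead of crawling.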

3. How to get the timestamp of the specific word in the subtitle?

As timestamps are sentence-level only, there is no perfect way to get them for each word. You will need to approximate them; I didn't do that in my case.
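One simple approximation, not from the project itself: spread each subtitle cue's time span across its words, weighted by word length, so longer words get a larger share of the duration.

```python
def word_timestamps(sentence, start, duration):
    """Approximate a start time for each word in a subtitle cue by
    distributing the cue's duration proportionally to word length."""
    words = sentence.split()
    total_chars = sum(len(w) for w in words) or 1
    stamps, t = [], start
    for w in words:
        stamps.append((w, round(t, 3)))
        t += duration * len(w) / total_chars
    return stamps
```

This ignores pauses and speaking-rate variation, but for jumping a video player to roughly the right word it is usually close enough.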

Hope the answers are helpful. Let me know if you have more questions!



Hi guys,

This project takes your baseball pitching clips and automatically generates the overlay. A fine-tuned YOLOv4 model is used to get the location of the ball. Then, I implemented the SORT tracking algorithm to keep track of each individual ball. Lastly, I apply some image registration techniques to deal with slight camera shifts between clips.

I'm still trying to improve it! Feel free to follow this project, and also check out the Todo list.

The idea came from this incredible overlay: https://www.youtube.com/watch?v=jUbAAurrnwU&ab_channel=YuTub...
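The camera-shift compensation mentioned above can be illustrated with a brute-force translation search. This is only a pure-Python sketch of the idea, not the project's implementation; in practice you would use an OpenCV registration routine such as `cv2.findTransformECC`, which is far faster and handles sub-pixel and non-translational motion.

```python
def estimate_shift(ref, cur, max_shift=3):
    """Find the integer (dy, dx) translation that best aligns frame
    `cur` to frame `ref` by minimizing mean squared difference.
    Frames are 2D lists of grayscale pixel values."""
    h, w = len(ref), len(ref[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:   # overlap region only
                        d = ref[y][x] - cur[sy][sx]
                        err += d * d
                        n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dy, dx)
    return best
```

Applying the inverse of the estimated shift to each clip before compositing keeps the pitcher's mound fixed, so the overlaid trajectories line up.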


BTW, did anyone notice that the pitcher throws the ball with the exact same posture, yet it flies on a completely different path each time? It's just amazing!


I'm glad you like it!

As an alternative, feel free to play around with it and get familiar with the setup process in the Google colab notebook.

https://colab.research.google.com/github/hardik0/AI-basketba...


Is there a pure notebook version stripped of the application? I'd like to play with this on iko.ai. Already have the Coco dataset, GPU support, and most of the dependencies in our image.


Hi, do you mean the pure Python ML part without the web app and any Flask stuff?


Yes. Purely from a notebook to load the model, give it an image.


Hi, the link provided is the pure ML part of the project, without any Flask thing.

Unfortunately, I haven't written a notebook for it. Maybe you could figure it out from the source code? The code itself is actually pretty simple and short.

https://github.com/chonyy/basketball-shot-detection

