It depends on the problem statement. For problems like predicting user behavior or generating user segments over huge datasets, it's usually PySpark with MLlib and the other typical tooling. For other problem statements, anything from PyTorch to TensorFlow gets used, and certain models are provided out of the box by the likes of Hugging Face.
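For the segmentation case, MLlib exposes clustering algorithms like `pyspark.ml.clustering.KMeans`. As a rough illustration of what that does under the hood, here is a dependency-free sketch of k-means over toy "user activity" features; the data, seed centroids, and feature names are made up for the example, not taken from any real pipeline.

```python
def kmeans(points, centroids, iters=10):
    """Plain-Python k-means: assign points to nearest centroid, then
    recompute each centroid as the mean of its cluster, repeatedly."""
    for _ in range(iters):
        # assign each point to its nearest centroid (squared distance)
        clusters = [[] for _ in centroids]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # recompute centroids as per-dimension cluster means
        centroids = [
            tuple(sum(dim) / len(cluster) for dim in zip(*cluster)) if cluster else c
            for cluster, c in zip(clusters, centroids)
        ]
    return centroids, clusters

# hypothetical user features: (sessions_per_week, avg_minutes_per_session)
users = [(1, 5), (2, 6), (1, 4), (10, 50), (11, 48), (9, 52)]
centroids, segments = kmeans(users, centroids=[(1, 5), (10, 50)])
# segments[0] holds the low-activity users, segments[1] the heavy users
```

In practice you wouldn't hand-roll this: on a real dataset you'd `VectorAssembler` the features into a column and fit MLlib's `KMeans` so the assignment step runs distributed across the cluster.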
Anything and everything.
So machine learning expertise is needed to build AI tools?
Why not ask this in the data science community too?