PRs to fix deserialization problems with Ollama in `sttp-openai`

When using `sttp-openai` with a locally running instance of Ollama, I ran into several deserialization issues (a minimal sketch of the setup is below the list). These PRs should fix them:

- Deserialize to custom model where possible
- Fix Choices Deserialization
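
For context, this is roughly the setup that triggered the errors. It's a sketch, not part of the PRs: the model name, port, and placeholder API key are just examples, and it assumes the `OpenAISyncClient` overload that accepts a custom base URI pointing at Ollama's OpenAI-compatible endpoint.

```scala
import sttp.model.Uri._
import sttp.openai.OpenAISyncClient
import sttp.openai.requests.completions.chat.ChatRequestBody.{ChatBody, ChatCompletionModel}
import sttp.openai.requests.completions.chat.ChatRequestResponseData.ChatResponse
import sttp.openai.requests.completions.chat.message.{Content, Message}

object OllamaChat extends App {
  // Point the client at the local Ollama endpoint; Ollama ignores the
  // API key, so any placeholder value works here.
  val client: OpenAISyncClient = OpenAISyncClient("ollama", uri"http://localhost:11434/v1")

  // Use a custom model name, since Ollama models aren't in the OpenAI enum.
  val chatBody: ChatBody = ChatBody(
    model = ChatCompletionModel.CustomChatCompletionModel("llama3.1"),
    messages = Seq(Message.UserMessage(content = Content.TextContent("Hello!")))
  )

  // Deserializing this response is where the failures showed up.
  val response: ChatResponse = client.createChatCompletion(chatBody)
  println(response)
}
```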

I’d appreciate a review when someone has the time!