Putting R in production is not only possible but also simple and straightforward. I have put various R models into production, leveraging H2O, Plumber, Docker, Cloud Run, MLflow, Spark Pipelines, and AWS Lambda. Of course, there are different ways of achieving your outcome, namely having a hosted and scalable API for your scorer or classifier. But what is the best method for you?
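To make the API idea concrete, here is a minimal sketch of a Plumber endpoint. The model file name and feature names (`x1`, `x2`) are placeholders for illustration, not from a real project:

```r
# plumber.R -- a minimal Plumber API sketch.
library(plumber)

# Assume a model fitted and saved earlier, e.g. with saveRDS(model, "model.rds").
model <- readRDS("model.rds")

#* Health check
#* @get /healthz
function() {
  list(status = "ok")
}

#* Score a single observation passed as query parameters
#* @param x1 numeric feature (hypothetical)
#* @param x2 numeric feature (hypothetical)
#* @get /predict
function(x1, x2) {
  newdata <- data.frame(x1 = as.numeric(x1), x2 = as.numeric(x2))
  list(prediction = predict(model, newdata))
}
```

You can serve this locally with `plumber::pr("plumber.R") |> plumber::pr_run(port = 8000)` and hit `/predict` with any HTTP client.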
I believe that this depends on your dev-ops and engineering department (if you are lucky enough to have one). Most often, your engineers are most comfortable with Python environments and with Docker for deployment purposes.
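Meeting engineers where they are usually means handing them a container. A sketch of a Dockerfile for an R API might look like the following; the base image is the community `rocker/r-ver` image, and the script name `plumber.R` is an assumption about your project layout:

```dockerfile
# Sketch: containerizing an R Plumber API (file name is illustrative).
FROM rocker/r-ver:4.3.1

RUN R -e "install.packages('plumber')"

COPY plumber.R /app/plumber.R
EXPOSE 8000

# Bind to 0.0.0.0 so the API is reachable from outside the container.
CMD ["R", "-e", "plumber::pr_run(plumber::pr('/app/plumber.R'), host = '0.0.0.0', port = 8000)"]
```

From your engineers' point of view, this is just another service: `docker build`, `docker run -p 8000:8000`, done.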
I found that serving up R (or a mix of R and Python, if you prefer) is straightforward. Nik Agarwal has written a blog post about his experience, and there are more examples out there. For most of my projects, I use the same data-wrangling and modeling environments as he does: the Tidyverse and Tidymodels.
Packaging is essential and lets you stay on top of your deployment. We will talk more about this in a series of posts that I plan to publish over the next months. One of my favorite ways to package up my work is using MOJOs from H2O.ai.
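Exporting a MOJO takes only a few lines. A sketch, using `mtcars` as a stand-in dataset rather than a real project:

```r
# Sketch: training an H2O model and exporting it as a MOJO artifact.
library(h2o)
h2o.init()

# Stand-in data and model; use whatever scorer your project needs.
data <- as.h2o(mtcars)
model <- h2o.gbm(y = "mpg", training_frame = data)

# Download the MOJO (a self-contained zip) plus the genmodel runtime jar.
# The MOJO can then be scored on the JVM without any R runtime present.
mojo_path <- h2o.download_mojo(model, path = ".", get_genmodel_jar = TRUE)
```

The appeal is that the scoring artifact is decoupled from the training environment, which is exactly what a deployment-focused engineering team wants to receive.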