I need to disambiguate some things first: Cloud Run is technically "serverless" because it ticks the usual serverless tenets:
• no need to manage servers
• scales automatically (in/out)
• price per usage
• fault tolerance
I'm going to assume that by "serverless" you mean "Functions as a Service" (often referred to as "serverless functions" or "FaaS"), such as AWS Lambda, Azure Functions, or Google Cloud Functions.
Cloud Run is essentially serverless containers, i.e. you provide a container image and Google runs it for you. AWS's equivalent of this is Fargate. In contrast, serverless functions (FaaS) accept a function or a collection of functions, typically packaged as a zip, and come with some constraints: the run time of serverless functions is typically limited, whereas serverless containers do not have execution time limits.
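To make the difference concrete, here is a minimal sketch of what each shape looks like in Python. The handler names and routes are made up for illustration; the sketch assumes the functions-framework and Flask packages, which is one common way (not the only way) to write these:

    # FaaS shape (e.g. Google Cloud Functions): you hand the platform a function;
    # the platform owns the HTTP server, packaging, scaling, and the execution time limit.
    import functions_framework

    @functions_framework.http
    def send_emails(request):
        # 'request' is the HTTP request object injected by the platform.
        # The work here must finish within the platform's time limit.
        return "emails sent", 200


    # Serverless-container shape (e.g. Cloud Run): you hand the platform a container
    # image; inside it you run your own server that listens on the port given in $PORT.
    import os
    from flask import Flask

    app = Flask(__name__)

    @app.route("/send-emails", methods=["POST"])
    def send_emails_route():
        # Longer-running work is fine here; you control the process.
        return "emails sent", 200

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))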
There's no "better" - these are two fundamentally different modes of operation. Do you have a task which runs for a short period and can be invoked at any time? Serverless functions are the way to go. Do you have tasks which run for longer periods of time, or which cannot have arbitrary execution time limits? Serverless containers are the way to go.
Cost depends on the cloud provider, but for serverless functions most providers have settled on a unit called "GB-seconds": the number of seconds a function runs multiplied by how much RAM (in GB) the function has been allocated. That is then multiplied by the number of times the function is invoked.
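In other words, the compute part of the bill is roughly a product of three numbers. A quick sketch of that formula in Python (the function name is mine, and rounding rules, free tiers, and per-CPU charges vary by provider, so treat it as an approximation):

    def gb_seconds_cost(invocations, seconds_per_run, memory_mb, price_per_gb_second):
        # GB-seconds = run time (seconds) * allocated memory (GB), summed over all invocations.
        # Ignores free tiers, per-invocation fees, and CPU (GHz-second) charges.
        memory_gb = memory_mb / 1024
        return invocations * seconds_per_run * memory_gb * price_per_gb_second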
To give you an example: if you have a function which runs every hour to send a chunk of emails, takes 10 seconds to run, and needs 128 MB of RAM, then per Google's pricing this function would be free (assuming you have no other functions), because Google doesn't charge for the first 2 million invocations and this function will only be invoked 720 times a month (24 times a day, 30 days).
Assuming you have already exceeded the 2 million free invocations, then you would have to calculate the cost; the pricing is detailed at
https://cloud.google.com/functions/pricing
So, assuming the function instead uses 256 MB of RAM, each month's invocations would cost ($0.0000025/4 * 720 * 10) ≈ $0.0045.
(Why divide by 4? The price is $0.0000025 per GB-second and we only need 256 MB, i.e. a quarter of a GB. Multiply by 10 because each invocation runs for 10 seconds, and by 720 for the number of invocations per month. I did not consider GHz-seconds, but you can find more details on the pricing page.)
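Plugging the example's numbers into that formula (using the $0.0000025 per GB-second memory price quoted above, and still ignoring the free tier and GHz-seconds):

    invocations = 720                # 24 runs/day * 30 days
    seconds_per_run = 10
    memory_gb = 256 / 1024           # 0.25 GB
    price_per_gb_second = 0.0000025  # Google's Tier 1 memory price

    monthly_cost = invocations * seconds_per_run * memory_gb * price_per_gb_second
    print(monthly_cost)              # ≈ 0.0045, i.e. about $0.0045 per month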