Executing an ALTER TABLE safely against very large tables

Hello,
I need to run this query in a production environment

ALTER TABLE METERING.T_VALORI_HH
  ADD (col_name VARCHAR(3));

My concern is about running a DDL statement against a table that currently contains 5 billion rows. Is this ALTER safe? Is there any reason to worry about database corruption? And can I estimate how long it will take?

Thanks a lot.

Yes, this is safe. The ALTER is executed as a transaction, so a failure partway through won’t leave the table corrupted. I’d still recommend taking a backup beforehand as an extra measure of safety.

Regarding how long it will take, let me see if I can get someone else to comment on that.

Okay, thanks. I still need some way to estimate how long it will take.
Thanks again.

We don’t have any built-in estimator for how long this will take. It will be faster on a columnstore table, because adding a new column there doesn’t require updating every row: it’s done with a metadata update to each segment (a roughly 1-million-row chunk of the table).

A workaround to get a time estimate would be to copy, say, 1% of the table into a separate scratch table, run the ALTER there, and multiply the elapsed time by 100 to estimate the time for the full table. You could also use a different fraction, smaller than 1% if that’s more convenient, and scale accordingly.
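As a concrete sketch of that workaround (the scratch table name and the `RAND()`-based sampling are placeholders; adjust to whatever sampling method and timing display your database and client support):

```sql
-- Copy roughly 1% of the rows into a scratch table.
CREATE TABLE METERING.T_VALORI_HH_SCRATCH AS
SELECT * FROM METERING.T_VALORI_HH
WHERE RAND() < 0.01;

-- Run the same ALTER on the scratch table and note the elapsed time
-- (most clients print query duration); multiply by ~100 for a rough
-- full-table estimate.
ALTER TABLE METERING.T_VALORI_HH_SCRATCH
  ADD (col_name VARCHAR(3));

-- Clean up when done.
DROP TABLE METERING.T_VALORI_HH_SCRATCH;
```

Keep in mind this gives only a rough estimate: the full-table run may not scale perfectly linearly with row count.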