I have to update certain values of a large table (for the sake of example, call it 'Resource'; it has over 5M rows), and so I need to make a backup before applying the changes. However, we do not have enough free space in the database to store a full copy of the table.
What is the best way to handle this? Is there a way to do it in blocks? I mean something like: back up the first 100K rows from the original table, update those 100K rows in the original table, delete those 100K rows from the backup table, back up the next 100K rows, and so on (see the rough sketch below). Is this feasible?
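To make the idea concrete, this is roughly what I have in mind for one pass, written in generic SQL. The table name Resource is from the example above; ResourceID, Resource_Backup, and SomeColumn are made-up names, and the exact syntax and looping would depend on the DBMS:

```sql
-- Rough sketch of one 100K-row pass. Assumes a numeric primary key
-- ResourceID, a pre-created backup table Resource_Backup with the same
-- structure, and a hypothetical column SomeColumn being changed.
-- The key range would advance on each pass (1-100000, 100001-200000, ...)
-- until the whole table has been processed.

-- 1. Back up the current block
INSERT INTO Resource_Backup
SELECT *
FROM   Resource
WHERE  ResourceID BETWEEN 1 AND 100000;

-- 2. Apply the change to the same block in the original table
UPDATE Resource
SET    SomeColumn = 'new value'          -- hypothetical change
WHERE  ResourceID BETWEEN 1 AND 100000;

-- 3. Once the block is verified, free the space in the backup table
DELETE FROM Resource_Backup
WHERE  ResourceID BETWEEN 1 AND 100000;
```

This way the backup table would never hold more than one block at a time, which is what keeps the space usage low.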