There could be a couple of answers, depending on how your table is organized.
If you have an index on SSN then one of the things you might try is a
simple loop like this:
loop
  update sales set ssn = mod(ssn, 10000)
   where ssn > 9999
     and rownum <= 1000;
  exit when sql%rowcount < 1000;
  commit;
end loop;
commit;
though that might be slow in certain cases (if your clustering factor
for ssn isn't good, for example).
You might give bulk collect/update a shot and see how something like
this performs for you:
cursor l_cur is select rowid from sales where ssn > 9999;
type l_tab is table of rowid index by binary_integer;
l_buffer l_tab;
...
open l_cur;
loop
  fetch l_cur bulk collect into l_buffer limit 1000;
  forall i in 1 .. l_buffer.count
    update sales set ssn = mod(ssn, 10000) where rowid = l_buffer(i);
  commit;
  exit when l_buffer.count < 1000;
end loop;
close l_cur;
That should be faster than the row-by-row processing in your example.
Another thing you could consider is doing a CTAS (create table as select) and then swapping the tables.
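A minimal sketch of the CTAS-and-swap idea; the new table name, the placeholder columns (other_col1, other_col2) and the nologging option are assumptions, and you would need to list your actual columns and recreate indexes, constraints and grants before the swap:

```sql
create table sales_new nologging as
select mod(ssn, 10000) ssn,   -- keep only the last 4 digits
       other_col1, other_col2 -- list the remaining columns explicitly
  from sales;
-- rebuild indexes/constraints/grants on sales_new here, then swap:
rename sales to sales_old;
rename sales_new to sales;
```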
On Fri, Nov 13, 2009 at 2:50 PM, Roger Xu wrote:
Background: 9i (184.108.40.206.0) HP-UX B.11.31 U ia64
We need to shrink SSN to the last 4 digits.
update sales set ssn=to_number(substr(to_char(ssn),-4));
But the sales table is quite large and we want to commit every 1000 records.
Does anyone have a better idea for doing this than the code below?
cursor c12 is select ssn from sales for update of ssn;
n number := 0;
...
for eachc12 in c12 loop
  if eachc12.ssn is not null then
    n := n + 1;
    update sales set ssn = to_number(substr(to_char(ssn), -4))
     where current of c12;
    if ( n = 1000 ) then
      commit; n := 0;
    end if;
  end if;
end loop;
commit;