fsx: fix infinite/too long loops when generating ranges for copy_file_range
author Filipe Manana <fdmanana@suse.com>
Mon, 20 Apr 2020 17:09:17 +0000 (18:09 +0100)
committer Eryu Guan <guaneryu@gmail.com>
Sun, 10 May 2020 12:33:47 +0000 (20:33 +0800)
While running generic/521 I noticed fsx taking a lot of CPU time and not
making any progress for several hours. Attaching gdb to the fsx process
revealed that fsx was stuck in the loop that generates the ranges for a
copy_file_range operation; in particular, the loop never terminated
because the range defined by 'offset2' kept overlapping with the range
defined by 'offset'.
So far this happened one time only in one of my test VMs with generic/521.

Fix this by breaking out of the loop after trying 30 times, like we
currently do for dedupe operations, which results in logging the operation
as skipped.
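
The capped-retry pattern the patch introduces can be sketched in isolation.
Note this is a minimal illustration, not the fsx code: ranges_overlap() and
pick_dest_offset() below are hypothetical stand-ins for fsx's
range_overlaps() helper and the loop inside test(), and the parameters
(maxfilelen, writebdy) are assumed to behave as their fsx counterparts do.

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical stand-in for fsx's range_overlaps(): nonzero when
 * [off1, off1+size) and [off2, off2+size) intersect. */
static int ranges_overlap(long off1, long off2, long size)
{
	return labs(off1 - off2) < size;
}

/* Sketch of the capped loop: pick a destination offset that does not
 * overlap the source range. After 30 failed attempts, set *size to 0
 * so the caller logs the operation as skipped instead of spinning
 * forever when no non-overlapping range exists. */
static long pick_dest_offset(long offset, long *size,
			     long maxfilelen, long writebdy)
{
	int tries = 0;
	long offset2;

	do {
		if (tries++ >= 30) {
			*size = 0;	/* signal "skip this operation" */
			return 0;
		}
		offset2 = random() % maxfilelen;
		offset2 -= offset2 % writebdy;	/* align to write boundary */
	} while (ranges_overlap(offset, offset2, *size) ||
		 offset2 + *size > maxfilelen);
	return offset2;
}
```

With offset = 0, size = 8 and maxfilelen = 10, every aligned candidate
either overlaps the source or runs past EOF, so the loop exhausts its 30
tries and zeroes the size rather than looping forever, which is exactly
the failure mode the patch guards against.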

Signed-off-by: Filipe Manana <fdmanana@suse.com>
Reviewed-by: Brian Foster <bfoster@redhat.com>
Signed-off-by: Eryu Guan <guaneryu@gmail.com>
ltp/fsx.c

index ab64b50ac2132e4d2fb4d7d81f3265495f29b3de..40cbd401b105ba9d1ee21d509d350350d12444e5 100644 (file)
--- a/ltp/fsx.c
+++ b/ltp/fsx.c
@@ -2051,17 +2051,25 @@ test(void)
                        break;
                }
        case OP_COPY_RANGE:
-               TRIM_OFF_LEN(offset, size, file_size);
-               offset -= offset % readbdy;
-               if (o_direct)
-                       size -= size % readbdy;
-               do {
-                       offset2 = random();
-                       TRIM_OFF(offset2, maxfilelen);
-                       offset2 -= offset2 % writebdy;
-               } while (range_overlaps(offset, offset2, size) ||
-                        offset2 + size > maxfilelen);
-               break;
+               {
+                       int tries = 0;
+
+                       TRIM_OFF_LEN(offset, size, file_size);
+                       offset -= offset % readbdy;
+                       if (o_direct)
+                               size -= size % readbdy;
+                       do {
+                               if (tries++ >= 30) {
+                                       size = 0;
+                                       break;
+                               }
+                               offset2 = random();
+                               TRIM_OFF(offset2, maxfilelen);
+                               offset2 -= offset2 % writebdy;
+                       } while (range_overlaps(offset, offset2, size) ||
+                                offset2 + size > maxfilelen);
+                       break;
+               }
        }
 
 have_op: