From c91bb519e88daaf51f9dfd47de6c4e29e08d6213 Mon Sep 17 00:00:00 2001
From: Zhu Shangzhong
Date: Fri, 28 Jul 2017 20:20:33 +0800
Subject: [PATCH] doc: change the default filesystem/pool names

- s/cephfs_data/cephfs_data_a
- s/cephfs_metadata/cephfs_metadata_a
- s#./rados df#bin/rados df
- update the 'bin/rados df' output
- remove the rbd pool, it isn't created by default anymore.

Signed-off-by: Zhu Shangzhong
---
 doc/dev/quick_guide.rst | 29 +++++++++++++----------------
 1 file changed, 13 insertions(+), 16 deletions(-)

diff --git a/doc/dev/quick_guide.rst b/doc/dev/quick_guide.rst
index e7cf350ddb407..d9c6a533b514d 100644
--- a/doc/dev/quick_guide.rst
+++ b/doc/dev/quick_guide.rst
@@ -34,35 +34,32 @@ You can also configure ``vstart.sh`` to use only one monitor and one metadata se
 
 	$ MON=1 MDS=1 ../src/vstart.sh -d -n -x
 
-The system creates three pools on startup: `cephfs_data`, `cephfs_metadata`, and `rbd`. Let's get some stats on
+The system creates two pools on startup: `cephfs_data_a` and `cephfs_metadata_a`. Let's get some stats on
 the current pools:
 
 .. code::
 
 	$ bin/ceph osd pool stats
 	*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
-	pool rbd id 0
-	  nothing is going on
-
-	pool cephfs_data id 1
+	pool cephfs_data_a id 1
 	  nothing is going on
 
-	pool cephfs_metadata id 2
+	pool cephfs_metadata_a id 2
 	  nothing is going on
 
-	$ bin/ceph osd pool stats cephfs_data
+	$ bin/ceph osd pool stats cephfs_data_a
 	*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
-	pool cephfs_data id 1
+	pool cephfs_data_a id 1
 	  nothing is going on
 
-	$ ./rados df
-	pool name       category  KB  objects  clones  degraded  unfound  rd  rd KB  wr  wr KB
-	rbd             -         0   0        0       0         0        0   0      0   0
-	cephfs_data     -         0   0        0       0         0        0   0      0   0
-	cephfs_metadata -         2   20       0       40        0        0   0      21  8
-	  total used    12771536    20
-	  total avail   3697045460
-	  total space   3709816996
+	$ bin/rados df
+	POOL_NAME         USED OBJECTS CLONES COPIES MISSING_ON_PRIMARY UNFOUND DEGRADED RD_OPS RD WR_OPS WR
+	cephfs_data_a        0       0      0      0                  0       0        0      0  0      0  0
+	cephfs_metadata_a 2246      21      0     63                  0       0        0      0  0     42 8192
+
+	total_objects    21
+	total_used     244G
+	total_space   1180G
 
 Make a pool and run some benchmarks against it:
 
-- 
2.39.5
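
Not part of the patch: the new ``bin/rados df`` output shown in the hunk is a fixed-width column table rather than the old ``KB``-based listing, so per-pool figures can be pulled out with a one-line awk program. A minimal sketch, run against the sample output copied from the hunk (the ``parse_rados_df`` helper name is made up for illustration):

```shell
# Sketch only: extract POOL_NAME and OBJECTS (columns 1 and 3) from the
# new tabular `rados df` layout. The here-doc is the sample output
# quoted in the patch; on a live vstart cluster you would pipe
# `bin/rados df` into the same awk program instead.
parse_rados_df() {
  # NR > 1 skips the header row; NF >= 12 skips the blank line and the
  # two-column total_* summary lines at the bottom.
  awk 'NR > 1 && NF >= 12 { print $1, $3 }'
}

parse_rados_df <<'EOF'
POOL_NAME         USED OBJECTS CLONES COPIES MISSING_ON_PRIMARY UNFOUND DEGRADED RD_OPS RD WR_OPS WR
cephfs_data_a        0       0      0      0                  0       0        0      0  0      0  0
cephfs_metadata_a 2246      21      0     63                  0       0        0      0  0     42 8192

total_objects    21
total_used     244G
total_space   1180G
EOF
# Prints:
# cephfs_data_a 0
# cephfs_metadata_a 21
```

The same filter works unchanged on live output (``bin/rados df | parse_rados_df``), since the summary lines never reach 12 fields.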