Thursday, July 20, 2017

Examples of CLI detecting filesystem on unmounted Block Device


Below are examples of Linux command-line (CLI) tools that can be used to detect or identify the filesystem on an unmounted block device.

root@vbox1:~# fdisk -l /dev/sdb
Disk /dev/sdb: 20 GiB, 21474836480 bytes, 41943040 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disklabel type: dos
Disk identifier: 0x00000000

Device     Boot Start      End  Sectors Size Id Type
/dev/sdb1  *    16065 41929649 41913585  20G 83 Linux
root@vbox1:~#

root@vbox1:~# blkid /dev/sda
/dev/sda: PTUUID="75d4a897" PTTYPE="dos"
root@vbox1:~# blkid /dev/sda1
/dev/sda1: UUID="4146cccc-7d36-4a23-af4c-972161e75220" TYPE="ext2" PARTUUID="75d4a897-01"

root@vbox1:~# file -s /dev/sda1
/dev/sda1: Linux rev 1.0 ext2 filesystem data (mounted or unclean), UUID=4146cccc-7d36-4a23-af4c-972161e75220 (large files)
root@vbox1:~#

root@vbox1:~# lsblk -f
NAME                            FSTYPE      LABEL UUID                                   MOUNTPOINT
fd0
sda
├─sda1                          ext2              4146cccc-7d36-4a23-af4c-972161e75220   /boot
├─sda2
└─sda5                          LVM2_member       QlTPdH-REeF-ljiz-lx2l-gF60-YQRj-MKeITR
  ├─vbox1--vg-root              ext4              76d0a16c-4b6c-4567-a1cb-cb1aa45ab28c   /
  └─vbox1--vg-swap_1            swap              2f357d2f-172a-41fc-9a26-32b1cc1b17cd   [SWAP]
sdb
├─sdb3
└─sdb8
sdc
├─sdc3
└─sdc8
sr0
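The lsblk output above can also be filtered down to just the unmounted entries that carry a filesystem. A minimal sketch, assuming util-linux's lsblk with `-P` (key="value" pairs) output; the helper name `filter_unmounted` is mine, not a standard tool:

```shell
#!/bin/sh
# filter_unmounted: given `lsblk -P -o NAME,FSTYPE,MOUNTPOINT` lines on
# stdin, print "/dev/NAME: FSTYPE" for rows that have a detected
# filesystem but no mount point.
filter_unmounted() {
    # Splitting on '"' puts NAME in field 2, FSTYPE in field 4,
    # and MOUNTPOINT in field 6.
    awk -F'"' '$4 != "" && $6 == "" { printf "/dev/%s: %s\n", $2, $4 }'
}

# Intended use (run as root so FSTYPE is populated for all devices):
#   lsblk -P -o NAME,FSTYPE,MOUNTPOINT | filter_unmounted
```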
 

# echo Hope it helps!
Hope it helps!

Thursday, July 13, 2017

Practical way to build nfs-ganesha from source on Ubuntu 16.04 Linux


This post is motivated by the issues I faced while building nfs-ganesha from source.
Refer to https://github.com/nfs-ganesha/nfs-ganesha/wiki/Compiling for the official steps.

I hit compile errors on Ubuntu 16.04 and, after a lot of debugging, came up with the steps below to build nfs-ganesha on Ubuntu 16.04.

# git clone https://github.com/nfs-ganesha/nfs-ganesha.git
# cd nfs-ganesha
# git submodule update --init --recursive
# apt-get install g++ libboost-dev cmake make git doxygen
# apt-get install build-essential libglu1-mesa-dev libc6-dev
# cmake $PWD/src
# lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 16.04.2 LTS
Release:        16.04
Codename:       xenial

# make
Scanning dependencies of target ntirpc
[  0%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/auth_none.c.o
[  0%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/auth_unix.c.o
[  1%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/authunix_prot.c.o
[  1%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/bindresvport.c.o
[  1%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/bsd_epoll.c.o
[  1%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/city.c.o
[  2%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/clnt_bcast.c.o
[  2%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/clnt_dg.c.o
[  2%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/clnt_generic.c.o
[  2%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/clnt_perror.c.o
[  2%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/clnt_raw.c.o
[  3%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/clnt_simple.c.o
[  3%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/clnt_vc.c.o
[  3%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/getnetconfig.c.o
[  3%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/getnetpath.c.o
[  4%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/getpeereid.c.o
[  4%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/getrpcent.c.o
[  4%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/getrpcport.c.o
[  4%] Building C object libntirpc/src/CMakeFiles/ntirpc.dir/mt_misc.c.o

.
.
.
.
[ 96%] Built target FsalCore
Scanning dependencies of target ganesha.nfsd
[ 96%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/nfs_main.c.o
[ 96%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/fsal_convert.c.o
[ 96%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/commonlib.c.o
[ 97%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/fsal_manager.c.o
[ 97%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/access_check.c.o
[ 97%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/fsal_config.c.o
[ 97%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/default_methods.c.o
[ 98%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/common_pnfs.c.o
[ 98%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/fsal_destroyer.c.o
[ 98%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL/fsal_helper.c.o
[ 98%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL_UP/fsal_up_top.c.o
[100%] Building C object MainNFSD/CMakeFiles/ganesha.nfsd.dir/__/FSAL_UP/fsal_up_async.c.o
[100%] Linking C executable ganesha.nfsd
[100%] Built target ganesha.nfsd

# make install
[ 17%] Built target ntirpc
[ 17%] Built target log
[ 19%] Built target config_parsing
[ 22%] Built target cidr
[ 23%] Built target avltree
[ 25%] Built target hashtable
[ 29%] Built target sal
[ 30%] Built target rpcal
[ 52%] Built target nfsproto
[ 52%] Built target nfs4callbacks
[ 54%] Built target nfs_mnt_xdr
[ 55%] Built target gos
[ 58%] Built target nlm
[ 59%] Built target string_utils
[ 60%] Built target rquota
[ 68%] Built target 9p
[ 69%] Built target sm_notify.ganesha
[ 70%] Built target netgroup_cache
[ 70%] Built target hash
[ 73%] Built target support
[ 75%] Built target uid2grp
[ 77%] Built target fsalnull
[ 80%] Built target fsalmdcache
[ 81%] Built target fsalpseudo
[ 82%] Built target fsalproxy
[ 87%] Built target fsalgpfs
[ 87%] Built target fsal_os
[ 90%] Built target fsalvfs
[ 91%] Built target fsalmem
[ 91%] Built target idmap
[ 94%] Built target MainServices
[ 96%] Built target FsalCore
[100%] Built target ganesha.nfsd
Linking C shared library CMakeFiles/CMakeRelink.dir/libntirpc.so
Linking C shared module CMakeFiles/CMakeRelink.dir/libfsalnull.so
Linking C shared module CMakeFiles/CMakeRelink.dir/libfsalproxy.so
Linking C shared module CMakeFiles/CMakeRelink.dir/libfsalgpfs.so
Linking C shared module CMakeFiles/CMakeRelink.dir/libfsalvfs.so
Linking C shared library CMakeFiles/CMakeRelink.dir/libfsalmem.so
Linking C executable CMakeFiles/CMakeRelink.dir/ganesha.nfsd
Install the project...

-- Install configuration: "Debug"
-- Installing: /etc/ganesha/ganesha.conf
-- Installing: /usr/share/doc/ganesha/config_samples
-- Installing: /usr/share/doc/ganesha/config_samples/gpfs.ganesha.nfsd.conf
-- Installing: /usr/share/doc/ganesha/config_samples/gpfs.conf
-- Installing: /usr/share/doc/ganesha/config_samples/rgw_bucket.conf
-- Installing: /usr/share/doc/ganesha/config_samples/xfs.conf

.
.
.
.
.
.
-- Installing: /usr/include/ntirpc/namespace.h
-- Installing: /usr/include/ntirpc/fpmath.h
-- Installing: /usr/include/ntirpc/version.h
-- Installing: /usr/lib/libntirpc.so.1.5.1
-- Installing: /usr/lib/libntirpc.so.1.5
-- Installing: /usr/lib/libntirpc.so
-- Installing: /usr/lib/ganesha/libfsalnull.so
-- Installing: /usr/lib/ganesha/libfsalproxy.so
-- Installing: /usr/lib/ganesha/libfsalgpfs.so
-- Installing: /usr/lib/ganesha/libfsalvfs.so
-- Installing: /usr/lib/ganesha/libfsalmem.so.4.2.0
-- Installing: /usr/lib/ganesha/libfsalmem.so.4
-- Installing: /usr/lib/ganesha/libfsalmem.so
-- Installing: /usr/bin/ganesha.nfsd

#
# echo That's it.
That's it.
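For repeat builds, the step-by-step commands above can be collected into a single shell function, with the package installs moved up front. A sketch under the same assumptions as the post (Ubuntu 16.04, run as root, network access to GitHub); the function name `build_nfs_ganesha` is mine:

```shell
#!/bin/sh
# build_nfs_ganesha: install prerequisites, then clone, configure, build
# and install nfs-ganesha, mirroring the commands in the post. The body
# runs in a subshell, so `set -e` (abort on first failure) and the `cd`
# do not leak into the calling shell.
build_nfs_ganesha() (
    set -e
    apt-get install -y g++ libboost-dev cmake make git doxygen \
        build-essential libglu1-mesa-dev libc6-dev
    git clone https://github.com/nfs-ganesha/nfs-ganesha.git
    cd nfs-ganesha
    git submodule update --init --recursive
    cmake "$PWD/src"
    make
    make install
)

# Intended use (as root):
#   build_nfs_ganesha
```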


Leave a comment if this post helped you in any way.

Thursday, June 22, 2017

How to delete KVM domains/VMs?


KVM domains (VMs) can be deleted by undefining them with virsh:

root@techsutram-vm2:~# /usr/bin/virsh list --all
 Id    Name                           State
----------------------------------------------------
 -     instance-00000003              shut off
 -     instance-00000005              shut off

root@techsutram-vm2:~# /usr/bin/virsh undefine instance-00000003
Domain instance-00000003 has been undefined

root@techsutram-vm2:~# /usr/bin/virsh list --all
 Id    Name                           State
----------------------------------------------------
 -     instance-00000005              shut off

root@techsutram-vm2:~# /usr/bin/virsh undefine instance-00000005
Domain instance-00000005 has been undefined

root@techsutram-vm2:~# /usr/bin/virsh list --all
 Id    Name                           State
----------------------------------------------------

root@techsutram-vm2:~#
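With many leftover instances, undefining them one by one gets tedious. A sketch that parses `virsh list --all` and picks out every domain in the "shut off" state; undefining is irreversible, so review the list before piping it anywhere (the helper name `shutoff_domains` is mine):

```shell
#!/bin/sh
# shutoff_domains: given `virsh list --all` output on stdin, print the
# names of domains whose state is "shut off" (the Id column shows "-"
# for inactive domains, which also skips the header and separator rows).
shutoff_domains() {
    awk '$1 == "-" && $3 == "shut" && $4 == "off" { print $2 }'
}

# Intended use (destructive -- double-check the list first!):
#   virsh list --all | shutoff_domains | xargs -r -n1 virsh undefine
```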

Tuesday, March 17, 2015

How to restore and preserve multiple Screen sessions?



A new problem and another quick solution!

As usual, while working on many servers (imagine 20 Linux/Unix machines), it is often difficult to remember all the machine names and their configurations, yet keep every individual setup at your fingertips.
The obvious choice on Linux is the Screen terminal multiplexer.

But there are some limitations :(
What if the machine running your Screen sessions reboots or shuts down? Restarting each and every session after the Screen machine is back up is very time-consuming and more or less uncomfortable during stressful times (an alternate wording for being lazy :) ).

I wrote the following (very simple) wrapper scripts on top of Screen commands to bring up all sessions as quickly as possible with a single Screen command. Note that all machine names and other terms in the session commands below are generalized.

[root@techsutram.com screen-sessions]# ls
Setup_1.ssh  Setup_2.ssh  Setup_3.ssh  Setup_4.ssh
[root@techsutram.com screen-sessions]# cat Setup_1.ssh
screen -t DRIVER_EXECUTION ssh root@launcher.techsutram.com
screen -t node1 ssh root@node1.techsutram.com
screen -t node2 ssh root@node2.techsutram.com
screen -t node3 ssh root@node3.techsutram.com
screen -t node4 ssh root@node4.techsutram.com

[root@techsutram.com screen-sessions]# screen -S _SETUP_1_ -c ~/screen-sessions/Setup_1.ssh


Screen will prompt for passwords (if required) for each individual machine (use ^a" and select the node to switch windows).
Hopefully it helps! At least it will help me sometime in the future.
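Writing the Setup_N.ssh files by hand also gets repetitive; they can be generated from a host list. A minimal sketch (the helper name `gen_setup` is mine); each window title is the short hostname, i.e. everything before the first dot:

```shell
#!/bin/sh
# gen_setup: print one `screen -t <title> ssh root@<host>` line per host
# argument, using the short hostname as the window title.
gen_setup() {
    for host in "$@"; do
        # ${host%%.*} strips the longest suffix starting at the first dot
        printf 'screen -t %s ssh root@%s\n' "${host%%.*}" "$host"
    done
}

# Intended use:
#   gen_setup node1.techsutram.com node2.techsutram.com > Setup_1.ssh
#   screen -S _SETUP_1_ -c Setup_1.ssh
```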

Tuesday, November 4, 2014

Balanced Scorecard approach to Software Quality



A few months back, I was introduced to the Balanced Scorecard concept by Rajul @Sunstone.
It essentially maps business strategy onto customer, finance, internal-process, and learning & growth perspectives. More information is available at Balanced scorecard.

The Balanced Scorecard framework prompted a thought: could the same approach be applied to software quality? The table below does exactly that. How each Key Performance Indicator (KPI) is measured or tracked could be a matter of debate, as KPIs could be tracked weekly, monthly, quarterly, or yearly.

The formulae listed in the table below can easily be tweaked to individual needs. No guarantees of any sort :). A few of these KPIs are available on the internet on various software testing and quality assurance forums; this article tries to put them into the Balanced Scorecard framework.

Assumptions:
  • We know how to calculate the total cost of testing efforts.
  • There could be other KPIs that individuals leverage, but I cannot list everything here. These KPIs are for example purposes only.
 
Customer

1. Objective:  Improve on features shipped
   Measure:    Number of feature requests
   Target:     Identify the top 10 features requested by customers
   Initiative: Analyze escalations, mailing lists, and sales inputs;
               determine how many of them qualify as features.

2. Objective:  Reduce critical bugs in production
   Measure:    Number of critical bugs reported by customers
   Target:     Reduce critical bugs to 10% relative to the previous release
               (final goal: zero critical bugs in production)
   Initiative: Analyze escalations, customer-reported incidents, mailing
               lists, etc.

3. Objective:  Improve product delivery cycle time
   Measure:    Automation productivity to expedite delivery
   Target:     Improve automation productivity consistently, release over
               release (e.g. by 100%, 80%, or 50%)
   Initiative: Track automation productivity as
               (total number of automated tests) / (total automation effort)

4. Objective:  Improve product delivery cycle time
   Measure:    Test cycle time
   Target:     Reduction in total testing time
   Initiative: Track reduction in total test cycle time as
               (total testing downtime) / (total test execution time)

5. Objective:  Innovation
   Measure:    Number of new ideas generated
   Target:     Improve on implementing ideas in the product
   Initiative: Track idea implementation as
               (number of new ideas/suggestions) / (number of ideas/suggestions implemented)

Finance

1. Objective:  Reduce software testing cost
   Measure:    Cost per test case
   Target:     Reduce the cost per test case
   Initiative: Track cost per test case as
               (total cost) / (number of test cases)

2. Objective:  Reduce software testing cost
   Measure:    Cost per automated test case
   Target:     Reduce the cost per automated test case
   Initiative: Track cost per automated test case as
               (total automation cost) / (number of automated test cases)

3. Objective:  Reduce cost of minor releases
   Measure:    Release cost
   Target:     Reduce the cost of a minor release compared to previous minor releases
   Initiative: Track cost per minor release as
               minor (release defects filed + release resources) /
               [major (release defects filed + release resources) + minor (release defects filed + release resources)]
               and consistently track the output over all releases.

4. Objective:  Reduce cost of major releases
   Measure:    Release cost
   Target:     Reduce the cost of a major release compared to previous major releases
   Initiative: Track cost per major release as
               major (release defects filed + release resources) /
               [major (release defects filed + release resources) + minor (release defects filed + release resources)]
               and consistently track the output over all releases.

Internal processes

1. Objective:  Improve test effectiveness
   Measure:    Automation percentage
   Target:     Improve the overall automation percentage release over release
   Initiative: Track the automation percentage as
               (automated tests) / (manual tests + automated tests)

2. Objective:  Improve test effectiveness
   Measure:    Manual-testing percentage
   Target:     Reduce manual effort to close to zero
   Initiative: Track the percentage of manual tests as
               (manual tests) / (manual tests + automated tests)

3. Objective:  Improve test effectiveness
   Measure:    Defect slippage to internal customers and production (consider deferred incidents as well)
   Target:     Zero defect slippage to internal customers and production
   Initiative: Track defect slippage as
               (defects found by internal customers) / (defects found by internal customers + deferred defects)

4. Objective:  Improve software testing quality
   Measure:    Defect severity distribution
   Target:     Zero high-severity incidents
   Initiative: Track the severity distribution as
               (Sev 1 and Sev 2 defects) / (total number of defects)

5. Objective:  Improve test effectiveness (operations)
   Measure:    Number of parallel qualifications compared to major/minor releases
   Target:     e.g. 1. less than the number of major and minor releases combined; 2. less than m
   Initiative: Track qualifications in progress as
               (non-release qualifications) / (major releases + minor releases)

6. Objective:  Improve documentation
   Measure:    Level of completeness, accuracy, and simplicity
   Target:     Improve the quality of documentation
   Initiative: Track the number of new documents per release, the number of
               updates per document per release, and the percentage of
               updates relative to new features.

7. Objective:  Improve software testing quality
   Measure:    Tests passing per (daily) build
   Target:     Set targets for the percentage of tests passing per daily
               build, e.g. less than 10%, less than 5%, less than 2%
   Initiative: Track tests passing per build as
               (tests passed) / (total tests planned for execution)

8. Objective:  Improve software testing quality
   Measure:    Features per iteration (or release); story points per
               iteration in the Agile model
   Target:     Measure how many features ship as part of each iteration (or release)
   Initiative: Track features per iteration as
               (features planned in the iteration or release) / (total targeted features)

9. Objective:  Improve on business-value deliverables
   Measure:    How early can we deliver value in the release?
   Target:     In the Agile model, deliver high value in early sprints
   Initiative: Measure per-feature business value as
               customer value of feature x = Cx points
               story points of feature x = Sx points
               % business value of feature x = Cx / Sx

Learning and Growth

1. Objective:  Resource availability
   Measure:    If a release is accepted for qualification, how many
               qualified resources are available for it?
   Target:     Sufficient resources available to work on every qualification
   Initiative: Track trained-resource availability as
               (resources working on qualification) / (total available resources)

2. Objective:  Knowledge transfer / Transfer of Information
   Measure:    Number of knowledge transfers (ToIs) happening within the testing organization
   Target:     At least one KT (ToI) per week within the team
   Initiative: Track knowledge-transfer progress as
               (ToIs delivered) / (total ToIs delivered + total ToIs in the pipeline)

3. Objective:  Cross-team collaboration
   Measure:    Effort spent by your team (A) collaborating with other teams (B)
   Target:     Team collaboration should be 100%
   Initiative: Track team collaboration as
               (effort spent by team A) / (effort spent by team A + effort spent by other teams B)







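Most of the formulae above are simple ratios, so they are easy to script and track over time. A sketch computing the automation-percentage KPI from the Internal processes table, with made-up numbers (the helper name `automation_pct` is mine):

```shell
#!/bin/sh
# automation_pct: percentage of automated tests, i.e.
#   100 * automated / (manual + automated)
# awk is used because plain sh arithmetic is integer-only.
automation_pct() {
    awk -v a="$1" -v m="$2" 'BEGIN { printf "%.1f\n", 100 * a / (a + m) }'
}

# Example: 120 automated tests, 80 manual tests
automation_pct 120 80    # prints 60.0
```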
As noted above, the table lists KPIs that can be tweaked to individual needs, but I cannot guarantee that implementing them will lead to success.

If you want to suggest a correction (I think there is scope for improvement) or additional KPIs for any perspective above, leave a comment below.



