Expert: DEVASHIS NAYAK

Expert in job
Not Specified
Expert Skills

Software Testing, Quality Assurance, Improving Test Effectiveness, End-to-End Automation, Requirements Based Testing, Requirements Validation, Model Based Testing, Scriptless Testing

Expert Brief Profile

MD & CEO of RBT Technologies, B.E., M.S. (Electrical Engineering), with twenty years of work experience in the IT sector, including seven years as a QA Manager and fourteen years in software product testing, quality management and project management.

Expert Detail Profile

RBT Technologies Pvt. Ltd. (an alliance partner of BenderRBT Inc., NY, USA), May 2011 to date
MD & CEO

The role focuses on the company's strategic direction and executive business management, and on leading the company's innovative approach to software testing, test automation, methods and tools, research and education programs, solution offerings, technology consulting and thought leadership. Responsibilities of this position include, but are not limited to, quality process assessments, compliance with process standards, and use of measurements that help build a Testing Centre of Excellence (TCoE); practical, workshop-oriented training classes; hands-on mentoring; and turn-key testing services, with the objective of helping clients improve the quality of their software while reducing delivery time and cost.

Major contributions to novel approaches are in the areas of:
• Test case design: a test design technique based on cause-effect graphing and path sensitizing that finds faults in both requirements and code, and that can be applied to system, integration and unit testing.
• Agile methodologies: determining which existing test cases need modification and which additional test cases are required to retain full coverage when requirements change, and implementing test-driven development (TDD) on requirements.
• Requirements Based Testing: ensuring that requirements are correct, complete, logically consistent, unambiguous and not redundant (design for testability), and creating a necessary and sufficient set of test cases, from a black-box perspective, to ensure that the design and code fully meet those requirements.
• Structural coverage: measuring the structural (MC/DC) coverage obtained by executing black-box (functional) test cases without any coverage analyser tool, and supplementing tests to attain 100% MC/DC coverage (a minimal illustration appears at the end of this profile).
• Test effectiveness: improving and measuring test effectiveness quantitatively with regard to structural coverage, fault-finding ability and test suite size.
• Test case prioritization: identifying test cases with high requirement complexity (e.g., fan-out) and volatility; subsetting the test library; risk-based testing.
• Regression test selection from specification: making regression test selection criteria more objective via modification identification, and selecting a regression test suite that differs from, and is more effective than, the current state of the art.
• Test coverage: how the coverage matrix affects test planning.
• Test productivity: improving test productivity (i.e., the rate of code and fault detection) as well as defect removal efficiency and rate.

Azilon Software Solutions Pvt. Ltd., Bangalore, India, Jan 2004 to March 2011
QA Manager

The role covers all phases of the software development process to improve the usability, design, feature capabilities and reliability of the product. It also demands thorough exploration of the product concept space through external search, creative problem solving and systematic exploration of the various solution fragments the team generates to address customer needs, as well as building a cross-functional team and maintaining strong communication with all stakeholders. Responsibilities include:
• Involvement in requirement analysis, establishing target specifications, preparing the functional definition, and conceptualizing the product, including concept generation and concept testing by competitive benchmarking and task analysis. Implementing human factor analysis (ergonomics) and guiding UI programmers to develop a proof-of-concept model, i.e., prototyping.
• Preparing test plans, test design techniques (structural and black-box) and strategy; identifying test requirements; selecting and prioritizing test cases; designing and executing test cases for integration, regression, system, user acceptance, load, performance and scalability testing (manual and automated testing of client/server and web applications using WinRunner, QTP, LoadRunner, TestDirector and OpenSTA); and managing a Performance Engineering team to model, measure and tune .NET and Java application performance.
• Chairing all review meetings (walkthroughs) with the CTO and customers and ensuring design reviews from performance and scalability perspectives. Providing performance sign-off and reviewing results with management and other technical groups. Establishing product certification that spells out the level of testing or inspection involved and any standards met by the program, the development process or the testing process.
• Planning, deploying and managing the QA effort for any given engagement/release. Identifying alignment issues between testing and the rest of the organization, measuring the performance and productivity of QA personnel, and hiring QA personnel. Monitoring defect tracking and reporting to improve communication and reduce delays. Providing metrics on bugs and tracking bug trends, backlog trends and velocity (for agile projects).
• Implementing testing and automation tool best practices, making the test environment simulate production, and ensuring that tests and procedures are properly understood, carried out and evaluated and that product modifications are investigated when necessary.
• Preparing the weekly status report and communicating it to the CTO/VP of product development.
• Implementing and evolving appropriate measurements and metrics, and performing structured reviews of quality activities that identify lessons learned and help optimize the process.
• Monitoring and analyzing test effectiveness, test code coverage, product code changes and changes to dependencies in order to prioritize testing. Identifying appropriate test metrics and deciding test adequacy criteria.
• Analyzing design issues and feature capabilities for enhancement of existing products, and tracing requirement compliance across life-cycle phases.
• Creating test plans, deciding testing strategies, writing and executing test cases, mapping test cases to business requirements, and building and updating the regression automation suite. Identifying test scenarios and creating scripts and workload profiles; designing and implementing comprehensive load/stress testing scenarios, along with the associated configuration/setting changes, to perform capacity planning, soak tests, etc.
• Identifying and configuring the test environment, identifying performance acceptance criteria and implementing test designs. Measuring the performance of web-based applications (.NET and Java), providing performance-tuning recommendations and scale-up or scale-out options, and analyzing performance test results to find bottlenecks, evaluate the impact on production infrastructure, and mitigate risks related to speed, scalability and stability.
• Building a cross-functional team and guiding it through the development process to ensure quality of deliverables. This includes leading the team to identify performance objectives, measure performance and formalize them in a service-level agreement (SLA), and to perform risk analysis, feature set selection, code reviews, standards compliance checking, error guessing, grey-box testing, comparative product evaluation, capacity planning, etc.
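
A minimal sketch of the MC/DC idea referenced above, assuming a hypothetical three-condition decision (A and B) or C and using Python purely for illustration; it is not the expert's own tooling. MC/DC requires that, for each condition, the test suite contain a pair of test vectors that differ only in that condition and flip the decision outcome:

    from itertools import product

    def mcdc_independence_pairs(decision, n_conditions):
        """For each condition index, list pairs of input vectors that differ
        only in that condition and change the decision outcome. Covering one
        such pair per condition demonstrates MC/DC for the decision."""
        pairs = {i: [] for i in range(n_conditions)}
        for vector in product([False, True], repeat=n_conditions):
            for i in range(n_conditions):
                flipped = list(vector)
                flipped[i] = not flipped[i]
                flipped = tuple(flipped)
                if decision(*vector) != decision(*flipped):
                    pair = tuple(sorted((vector, flipped)))
                    if pair not in pairs[i]:
                        pairs[i].append(pair)
        return pairs

    # Hypothetical decision with three conditions: (A and B) or C
    decision = lambda a, b, c: (a and b) or c

    for cond, candidates in mcdc_independence_pairs(decision, 3).items():
        print(f"condition {cond}: independence pairs {candidates}")

For this example decision, selecting the vectors {TTF, FTF, TFF, TFT} covers one independence pair per condition, i.e., 100% MC/DC with n + 1 = 4 test cases; black-box test cases derived from the requirements can be mapped onto such vectors to check whether full MC/DC has already been reached or supplementary tests are needed.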