Software Development Process Control with Scrum & Kanban

Presents an approach to software development process control focusing on Scrum & Kanban practices
by Rafał Pokrywka, 14 October 2013

Transcript of Software Development Process Control with Scrum & Kanban

List of issues for sprint
Committed to and controlled by Dev Team
Planned during Sprint Planning Meeting
Estimation done during sprint planning meeting
Can change, but only by the Dev Team
Scrum Framework
Sprint
Integration Tests
Coding Standards
organized
commented
naming conventions followed
method size (e.g. < 40 lines)
class size (e.g. < 1000 lines)

Code Conventions
Good Unit Tests
it is code - it needs maintenance
every test should be independent of the others
organized in commented sections
with good coverage of functionality
don't use "new" -> FactoryMethods
configuration by defaults -> ConfigurationMethods
complex assertions -> VerificationMethods
one assertion per test, with an appropriate message
only code relevant to all tests in the setup method
local over global vars
no need to test private methods
(see the sketch below)
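A minimal JUnit 5 sketch of these patterns; the Account class and the helper names are hypothetical, invented for illustration:

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Minimal class under test, only here to make the sketch self-contained
class Account {
    private int balance;
    void deposit(int amount)  { balance += amount; }
    void withdraw(int amount) { balance -= amount; }
    int balance()             { return balance; }
}

class AccountTest {

    // FactoryMethod: tests never call "new Account(...)" directly,
    // so a constructor change touches one place only
    private Account accountWithBalance(int balance) {
        Account account = new Account();
        account.deposit(balance);
        return account;
    }

    // VerificationMethod: hides a (potentially complex) assertion
    // behind an intention-revealing name
    private void assertBalanceIs(int expected, Account account) {
        assertEquals(expected, account.balance(),
                "balance after the operations should be " + expected);
    }

    @Test
    void withdrawalReducesBalance() {
        Account account = accountWithBalance(100);
        account.withdraw(30);
        assertBalanceIs(70, account); // one assertion per test, with a message
    }
}
```

The factory and verification helpers keep each test down to one readable assertion, so a failing test says exactly what broke.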
What Tests?
Regression Tests
Exploratory Tests
System Tests
Acceptance Tests
Non-Functional Tests
Usability Tests
Load Tests
Stress Tests
Sprint Definition of Done
Stories Done (DoD)
All Defects Fixed
...
Testability
Degree to which a software artifact is easy to test in a given test context
Test Objective
Test level - Unit/Component/System
Test harness - tools and environment
Test type - functional/non-functional
?
Requirements
Design/Specification
Documentation
Code
Coding Standards
TDD

Dependency Optimization - Dependency Injection
Mocks

Documentation and Specification
Logs
[Scrum diagram: Product Backlog -> Sprint Backlog -> Working Increment]
Design Patterns
Dependency Optimization - Dependency Injection
Red -> Green -> Refactor
UI tests
Persistence Layer
Mock
Problems
promotes productivity increases
modular, flexible, extensible code -> small, unit-tested
focus on interfaces
easier to mock
Unit tests exist! (see the sketch below)
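A minimal sketch of why injected dependencies are easier to mock; RateService and PriceCalculator are hypothetical names, and the fake is hand-rolled rather than taken from any mocking library:

```java
// The dependency is an interface, so tests can substitute it freely
interface RateService {
    double rateFor(String currency);
}

// The collaborator arrives through the constructor instead of being created with "new"
class PriceCalculator {
    private final RateService rates;

    PriceCalculator(RateService rates) {
        this.rates = rates;
    }

    double priceIn(String currency, double basePrice) {
        return basePrice * rates.rateFor(currency);
    }
}

class PriceCalculatorTest {
    @org.junit.jupiter.api.Test
    void convertsUsingInjectedRate() {
        // The "mock": a fixed-rate fake - no network, no real service
        RateService fixedRate = currency -> 4.0;
        PriceCalculator calculator = new PriceCalculator(fixedRate);
        org.junit.jupiter.api.Assertions.assertEquals(
                40.0, calculator.priceIn("PLN", 10.0), 0.0001);
    }
}
```

Had the calculator called new on a concrete rate service inside its constructor, the test would need the real service; the injected interface is what makes the one-line fake possible.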
Good Test Case (example below)
story id
short abstract (title)
defined goal and description
preconditions (environment & setup)
postconditions (cleanup)
test procedure (step by step)
expected results - verification part
Mark For Regression
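A filled-in example under these headings, reusing the trader story from this deck (all concrete values are hypothetical):

story id: US-42 (trader changes data resolution)
abstract: switching resolution redraws the chart
goal: verify that selecting a coarser resolution shows longer trends
preconditions: trader logged in, instrument chart open at daily resolution
postconditions: log out, resolution reset to default
procedure: open the resolution menu -> select weekly -> confirm
expected results: chart redraws with weekly data points; selection persists after reload
Mark For Regression: yes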
Thanks !
test the right things
logical test cases
focus on coverage
But...
"Pigs & Chickens"
Release Plan
Product Owner
Scrum Master
Test Cases
business scenarios
negative paths
techniques -> equivalence partitioning, boundary values, decision tables (see the sketch below)
data driven scenarios
focus on coverage (branch, data)
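A minimal JUnit 5 sketch of equivalence partitioning and boundary values, built around a hypothetical age-based discount rule:

```java
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Minimal rule under test: reduced fare up to and including age 17
class Discount {
    static boolean forAge(int age) { return age <= 17; }
}

class DiscountBoundaryTest {

    // Equivalence partitions: minors vs. adults.
    // Boundary values: 17 (last discounted age) and 18 (first full-fare age).
    @ParameterizedTest
    @CsvSource({
            "0,  true",   // lower edge of the minor partition
            "17, true",   // boundary: still discounted
            "18, false",  // boundary: first full-fare age
            "65, false"   // representative value from the adult partition
    })
    void discountFollowsAgeBoundary(int age, boolean discounted) {
        assertEquals(discounted, Discount.forAge(age));
    }
}
```

Two values per boundary plus one representative per partition cover the rule with four cases instead of testing every age.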
standup
Sprint planning
burndown
review
retrospective
Functional/non-functional
Who & What
Rafał Pokrywka
Experienced developer, tester, tech lead, manager
IBM SWG Lab
Test automation framework (2 projects)
Data Acquisition, Cloud Computing projects etc.
VirtusLab
SMB Projects
SaaS
Test Automation
Luminis-Research
Text Mining Projects
PhD in progress on Root Causes of Anomalies in the Software Development Process
Agenda
Software Development Process
Scrum
Kanban
QA
Code Review
Agile Testing
Test Automation
CI
Coverage
ALM
Continuous Delivery
ALM Platform
Reporting
Once Upon A Time
Reqs
Design
Implementation
Tests
Maintenance
Waterfall
Would work if we understood everything, every time
Defined Process
Doomed To Fail
IT Project Success Rate 1994:
~15%
Cost & Time Overrun:
~170%
IT Project Success Rate 2004:
~34%
Cost & Time Overrun:
~70%
Features & Functions used:
Never 45%
Rarely 19%
Sometimes 16%
Often 14%
Always 7%
Henrik Kniberg
http://www.infoq.com/articles/Interview-Johnson-Standish-CHAOS
SDP as an Empirical Process
Everything Changes
Reqs
Team Skills & Size
Business
Customers' needs
Domain Understanding
Control through inspection and adaptation, for a process that is imperfectly defined and generates unpredictable and unrepeatable outputs.
Statistical Process Control uses control charts to measure results of experiments as a means for continuous improvement.
Variation is an inherent ingredient of Software Production
[Diagram: Input -> Output, with Measure and Experiment & Adapt closing the loop]
Complex Adaptive Systems
[Chart: number of prescribed practices - RUP ~120, XP 12, Scrum 9, Kanban 5, ? 0 - with Lean and feedback also marked]
1996
1996
1993-1995
Pair Programming
TDD
Planning Game
Team
CI
Small Releases
Refactoring
Coding standards
Collective Code Ownership
Simple Design
System metaphor
Sustainable Pace
2003
Principles:
Eliminate Waste
Amplify Learning
Decide As Late As Possible
Deliver As Fast As Possible
Empower The Team
Build Integrity In
See The Whole
Practices
Value Stream Mapping
Waste detection
Pull System
Measurements
Theory of Constraints
2007-2010
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
Agile Manifesto
2001
Better Success Rate
~60% - 80%
Agile Projects
In SDP, time, cost, and scope are things to measure and optimize
List of project issues
Issues are prioritized and ordered
Higher order = more detailed
Lower Order = less detailed
It's alive! But it should be good enough before the start
Established in Sprint 0
Often groomed = adding, detailing, estimating, changing order; done by the Dev Team and Product Owner
Dev Team
Stories
Usually speeds up product development by exposing and eliminating waste, push scheduling, and excessive WIP (mostly Scrum & Kanban)
Stakeholders
Users, Helpdesk, Management etc...
As a <stakeholder>
I want <what>
so that <why>
Cross-functional
Around 7 + SM + PO
As a
trader
I want
a way to change data resolution
so that
I could see longer trends
[Timeline: consecutive 4-week sprints]
Definition Of Done
Planning
Daily Scrum
Demo
Retrospective
15-minute meeting, with questions for everyone:
What did you do yesterday?
What are you going to do today?
Are there any problems?
I.N.V.E.S.T.
Independent
Negotiable - changeable until in the sprint backlog
Valuable
Estimable - enough details
Small
Testable - enough details
Estimations
Sprint Progress Control
Project Progress Control
Team Velocity
Scaling
Problems
Kanban
Where is Waste ?
8h
Sprint Goal
Part I
Business specs - selecting high priority issues
With Product Owner
Part II
Effort Estimation
Breaking up tasks
4h Demo to Stakeholders
Review of what has been done
Review of what has not been done
4h Retro
What went OK?
What went wrong?
How to improve?
Visualise Workflow
Limit Work In Progress (see the sketch below)
Measure & Optimize Flow
Kanban is Lean
Evolutionary approach
[Kanban board: To Do | Selected (WIP 2) | In Progress (WIP 3) | Ready | Deployed]
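A minimal sketch of the pull-and-limit mechanics behind such a board; the Column class and the demo are illustrative, not a real tool:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// One board column with a WIP limit: work is pulled in when there is capacity, never pushed
class Column {
    private final String name;
    private final int wipLimit;
    private final Deque<String> items = new ArrayDeque<>();

    Column(String name, int wipLimit) {
        this.name = name;
        this.wipLimit = wipLimit;
    }

    void add(String item) { items.addLast(item); }

    boolean hasCapacity() { return items.size() < wipLimit; }

    // Pull the oldest item from the upstream column, but only if this column has capacity
    boolean pullFrom(Column upstream) {
        if (!hasCapacity() || upstream.items.isEmpty()) {
            return false; // blocked: the limit exposes the bottleneck instead of hiding it
        }
        items.addLast(upstream.items.pollFirst());
        return true;
    }

    @Override
    public String toString() { return name + " " + items; }
}

class PullDemo {
    public static void main(String[] args) {
        Column selected = new Column("Selected", 2);
        Column inProgress = new Column("In Progress", 3);
        selected.add("story-1");
        selected.add("story-2");
        System.out.println(inProgress.pullFrom(selected)); // true: capacity available
        System.out.println(inProgress);                    // In Progress [story-1]
    }
}
```

When a column is full, pullFrom simply returns false: nothing new is started, and attention shifts to finishing or unblocking the work already in progress.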
Start with what you do now
Respect existing workflows
Respect existing roles
Apply Continuous Improvement
Pull System
Theory of Constraints
Waste elimination
Decide as Late as Possible
Flow
Explicit Process
Models to Recognize Constraints
Scrum Board: To Do | In Progress | Done
Burndown Chart
[Chart: Work Units (0-500) vs. Days (1-30)]
Units of Work Delivered / Sprint
Backlog Estimation ~ 350
Velocity ~ 25
350 / 25 = 14 iterations
[Release burndown: Work Units (0-350) vs. Sprints (1-14)]
Planned vs. Delivered
Focus on Actual Velocity, Not on Average
[Velocity chart: velocity per sprint - e.g. 10, 13, 15, ...]
Unit Tests > x%
Review Done
Acceptance Tests Passed
Technical Debt Minimized
Ward Cunningham's term = cumulative effect of shortcuts:
Coding standards not met
Lack of Unit Tests
Architecture shortcuts
Tool platform problems - lack of integration
Missing Test Automation
Any debt has to be paid someday
Scrum of Scrums
Divide & Conquer
Team 1
Team 2
PO
Team 3
PO
CPO
We could always provide accurate estimations.
Buhaha
What Influences Estimations
If we knew everything about a given task
If we had a stable environment
Task Understanding
Complexity
Unexpected tech problems
External Factors - meetings etc.
Skills
Domain Knowledge
Difficult Customer
How much will it cost?
How long will it take?
What will I get?
Time Estimations
Hours - objective measure
Clear communication
Difficult to estimate
Tough questions arise
Taking into account only Complexity
Assuming Ideal environment
Communication problems
Ideal Time Estimations
Idea of being Consistently Wrong
If you were wrong, you would want to improve, right?
Story Points
Abstract In Nature
Relative size of stories is estimated - Don't Think of Time
Technique: Planning Poker and the Fibonacci series (example below)
Velocity: Story Points per Sprint - this takes into account common variations
The most important Scrum metric.
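A hypothetical round to make the technique concrete: five estimators reveal 3, 5, 5, 8, and 5 from the Fibonacci deck; the 3 and the 8 explain their reasoning, the story is briefly discussed, and a re-vote converges on 5 story points.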
Variations
Team & Project Type Specific Metric
Lies Everywhere
Simula Research Lab Trials
Req Specifications
Includes misleading info
Irrelevant info
Badly written specs
Trial:
Group A: text of one page
Group B: same text on 7 pages
Group A Estimation: 117
Group B Estimation: 173
Trial:
Group A: original specs
Group B: same spec + irrelevant info
Group A Estimation: 117
Group B Estimation: 173
Trial:
Group A: req R1 - R4
Group B: req R1 - R5
Group C: req R1 - R5 but should not take into account R5
Group A and B Estimation: 4
Group C Estimation: 8
Trial:
Case A: Customer said 1000
Case B: Customer said 50
Case A estimations: 555
Case B estimations: 99
No info: 456
Defects
Bad Specs
Process & Tools
Iterations !
Estimations !
Increased WIP = lower quality + higher lead times
Value Stream Mapping Technique
Kanban Board
Product Backlog
Cadences
Daily Meeting
What is not prescribed
Pull System
WIP Limits
Scaling
Project Control
Classes Of Service
CFD diagrams
Control Chart
Lead Time
Cycle Time
Visual ! - Kanban Board
How Long?
Little's Law: lead time ≈ number of items in the backlog × cycle time
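A quick illustrative calculation with made-up numbers: with 20 items on the board ahead of a new request and an average cycle time of half a day per item, the new request can be expected in roughly 20 × 0.5 = 10 working days.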
Focus on items on board
Scrum focuses on People
How it's going ?
Are there any problems ?
Cross functional teams
Specific roles
Iterations
Meeting types
Estimations !
Team size limits
Requires a lot of discipline
Divide & Conquer
[Kanban board: To Do | Selected (WIP 2) | In Progress (WIP 3) | Ready | Deployed]
[Scrum board: To Do | In Progress | Done]
Team Specific
Project Type Specific
[Kanban board: To Do | In Progress (WIP 2) | In Review (WIP 2) | QA (WIP 2) | Deployed]
Cadences over weeks 1-5: Planning & triage, Release (transition costs), Retrospectives
Can be changed anytime
Does not have to be there at all :)
Release when you have something valuable! - not in a sprint timebox
An issue does not have to fit into a sprint!
When will X be ready?
Check Lead Time
Certain types of issues must be addressed according to a strict SLA
[The same Kanban board with the WIP limits varied: 2 per column, then 1, then 5, then back to 2]
Idle?
Blockage = where is the problem?
Reduce the limit to react faster
[Kanban board: To Do | In Progress (WIP 2) | In Review (WIP 2) | QA (WIP 2) | Deployed]
Swimlanes
QA
TDD
Tests
Code Review
BDD
Partially implemented functionality in a sprint?
We have 2 points free in this sprint?
Tests? We do those at the end of the sprint?
OK, we are doomed - we need to do overtime?
OK, we are doomed - we need to move the end of the sprint?
Aargh - we have a critical defect in production?
We have to have better velocity!
Story points inflation
Hmm - let's skip some tests
We have a new member on the team - what to do with velocity?
Velocity is not stable?
No Slack
Should we count defects toward velocity?
Use Kanban and Lean
Do what suits your organization and process
Possibility to find defects fast
Improved code readability
Education of junior programmers
Knowledge sharing
Lower risk
More maintainable code
Formal Inspections
Lightweight Review
Ad Hoc
Formal inspection stages: Planning -> Overview Meeting -> Inspection Meeting -> Rework -> Follow-Up -> Verification Meeting
Over the Shoulder Review
Pair Programming
Patch & e-mail pass-around
HP Trial: only 4 of the 21 defects found in review would have been caught by QA
How to Review
Checklist review
Use-Case review
Systematic review
Atlassian
[Chart: #Defects found vs. Reading Time and Code Size - 23%, 35%, 8%]
Review for at most one hour
Slow down code readings
Omissions are hard to find
Big Brother Effect
"stt".equals
Changesets
VCS vs. DVCS
Metrics
Defect Count
Inspection Rate
Defect Rate
Defect Density
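An illustrative calculation with made-up numbers: reviewing 300 LOC in one hour and finding 6 defects gives an inspection rate of 300 LOC/h, a defect rate of 6 defects/h, and a defect density of 6 / 0.3 kLOC = 20 defects per kLOC.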
Pros
Cons
time/cost
But take ROI into account
Examination of an SDP artifact to evaluate its technical content
Tests
Code & Design