
User story: Sanity check

<<< Back to [[Final soa model design]]

Summary

Introduction

  • Roles involved: SOA Architect, Administrator, Developer
  • Goals
    • Get a map of the dependencies between services
    • Be able to know when services that I use but do not control change (XXX1: What about the other way?)
    • Be able to ensure the quality of services by running tests
  • Features
    • Cartography of the SOA: dependency relations with external services
    • Web services validation
      • Various types of validation (see the sketch after this feature list)
        • Availability
        • Unit tests (request/response)
        • WSDL validation
      • Notifications/reports when validation fails
      • Validation scheduling
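
As an illustration of how these validation types could fit together, here is a minimal Java sketch; all names (ServiceValidator, AvailabilityValidator, ...) are made up for this page and are not the actual EasySOA API. The request/response tests are not shown here because they replay the recorded exchanges described in the user story below.

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;

/** Illustrative only: a common contract for the validation types listed above. */
interface ServiceValidator {
    /** Returns true if the target service passes this validation. */
    boolean validate(String serviceUrl) throws Exception;
}

/** Availability check: the endpoint answers with an HTTP status below 400. */
class AvailabilityValidator implements ServiceValidator {
    public boolean validate(String serviceUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(serviceUrl).openConnection();
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        return conn.getResponseCode() < 400;
    }
}

/** WSDL validation: the published WSDL still matches the one recorded earlier. */
class WsdlConsistencyValidator implements ServiceValidator {
    private final String recordedWsdl; // WSDL text stored when the test was set up

    WsdlConsistencyValidator(String recordedWsdl) {
        this.recordedWsdl = recordedWsdl;
    }

    public boolean validate(String serviceUrl) throws Exception {
        URL wsdlUrl = new URL(serviceUrl + "?wsdl");
        try (Scanner s = new Scanner(wsdlUrl.openStream(), "UTF-8")) {
            String currentWsdl = s.useDelimiter("\\A").next();
            return currentWsdl.equals(recordedWsdl); // strictest possible comparison
        }
    }
}
```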

User story

Service exchanges recording

As a SOA Administrator/Architect...

  1. GIVEN that I have a complete model of my services (and the services I use) in EasySOA
  • AND I need to gather exchanges to set up request/response validation tests
  • WHEN I use the service registry to browse to the service I want to test
  • AND I start the client scaffolder
  • THEN a client is generated for the service
  2. GIVEN that I have started a scaffolded client for my service
  • WHEN I make one or more requests
  • AND I choose to store them
  • THEN they will be kept for later use (for validation tests; see the exchange sketch after this story)
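
An exchange stored for later replay could be as simple as a request/response pair keyed by the service URL. A rough sketch under that assumption (Exchange and ExchangeStore are illustrative names, and a real store would likely live in Nuxeo rather than in memory):

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

/** Illustrative: one request/response pair captured from the scaffolded client. */
class Exchange {
    final String serviceUrl;
    final String requestBody;
    final String responseBody;
    final Instant recordedAt = Instant.now();

    Exchange(String serviceUrl, String requestBody, String responseBody) {
        this.serviceUrl = serviceUrl;
        this.requestBody = requestBody;
        this.responseBody = responseBody;
    }
}

/** In-memory stand-in for whatever store (Nuxeo or other) ends up holding exchanges. */
class ExchangeStore {
    private final List<Exchange> exchanges = new ArrayList<Exchange>();

    void save(Exchange exchange) {
        exchanges.add(exchange);
    }

    List<Exchange> findByService(String serviceUrl) {
        List<Exchange> result = new ArrayList<Exchange>();
        for (Exchange e : exchanges) {
            if (e.serviceUrl.equals(serviceUrl)) {
                result.add(e);
            }
        }
        return result;
    }
}
```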

Monitoring case:

  1. GIVEN that a probe/proxy has been set on my web service
  • WHEN a request is made to this service/through the proxy
  • THEN the request is stored for later use (for validation tests; a probe sketch follows)
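
Assuming a servlet-based proxy, the probe could be a plain servlet filter that saves each incoming request before forwarding it. The sketch below reuses the illustrative ExchangeStore above and is not the actual EasySOA probe implementation:

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

/** Illustrative probe: records every request going through the proxy, for later replay. */
public class RecordingProbeFilter implements Filter {

    private final ExchangeStore store = new ExchangeStore(); // see the exchange sketch above

    public void init(FilterConfig config) throws ServletException {
        // nothing to initialise in this sketch
    }

    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest httpReq = (HttpServletRequest) req;
        // Reading the body here would consume the request stream; a real probe would wrap
        // the request (HttpServletRequestWrapper) so the target service can still read it.
        store.save(new Exchange(httpReq.getRequestURL().toString(),
                "(body not captured in this sketch)", ""));
        chain.doFilter(req, resp); // forward to the actual service behind the proxy
    }

    public void destroy() {
        // nothing to clean up
    }
}
```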

Validation configuration

As a SOA Administrator...

  1. GIVEN that I want to set up some validation for my web services
  • WHEN I open the validation tool
  • THEN I am invited to configure a new validation configuration
  2. GIVEN that I am on the validation configuration screen
  • WHEN I choose some web services to target (e.g. an external "Weather" service)
  • AND I choose the validation features to enable (e.g. availability check + request/response tests + WSDL consistency over time test)
  • AND I configure them (request/response tests: choose the set of requests to use; WSDL consistency: choose the validation strictness)
  • THEN a new validation configuration is created and made available
  3. GIVEN that I want to test my newly created validation configuration
  • WHEN I manually trigger the validation
  • THEN the validation is run
  • AND a validation report is produced
  4. GIVEN that I want to schedule this validation configuration to run regularly
  • WHEN I configure the schedule
  • AND I configure the notifications to trigger (send what? to whom? up to which frequency?)
  • THEN the validation scheduling is set up (a configuration sketch follows this story)
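
The configuration created in these steps could boil down to a small data structure plus a manual trigger. The sketch below reuses the illustrative ServiceValidator interface from the features section; every field and class name is a placeholder, not the real model:

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative only: what a saved validation configuration might carry. */
class ValidationConfiguration {
    List<String> targetServiceUrls;      // e.g. the external "Weather" service endpoint
    boolean availabilityCheckEnabled;
    boolean requestResponseTestsEnabled;
    boolean wsdlConsistencyEnabled;
    String wsdlStrictness;               // e.g. "identical" vs "compatible"
    String cronSchedule;                 // e.g. "0 0 * * *" for a nightly run
    List<String> notificationEmails;     // who receives the report when a validation fails

    /**
     * Manual trigger: run every given validator against every target and build a report.
     * The caller is expected to build the validator list from the enabled flags above.
     */
    ValidationReport runOnce(List<ServiceValidator> validators) {
        ValidationReport report = new ValidationReport();
        for (String url : targetServiceUrls) {
            for (ServiceValidator validator : validators) {
                boolean passed;
                try {
                    passed = validator.validate(url);
                } catch (Exception e) {
                    passed = false; // an exception counts as a failed validation
                }
                report.add(url, validator.getClass().getSimpleName(), passed);
            }
        }
        return report;
    }
}

/** Minimal report: one "service / validator / result" line per check. */
class ValidationReport {
    final List<String> lines = new ArrayList<String>();
    boolean hasFailures;

    void add(String serviceUrl, String validator, boolean passed) {
        lines.add(serviceUrl + " / " + validator + " : " + (passed ? "OK" : "FAILED"));
        if (!passed) {
            hasFailures = true;
        }
    }
}
```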

Notifications triggering

  1. GIVEN that a scheduled validation is run on a certain webservice
  • WHEN that service fails during validation
  • AND a report is produced
  • THEN a notification is sent to all users chosen during validation configuration (a sketch follows)
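
In code, the notification step is little more than a conditional on the report produced by the run; a minimal sketch with the illustrative types above (the actual mail sending is left out):

```java
/** Illustrative: after a scheduled run, notify the configured users only on failure. */
class ValidationNotifier {

    void notifyIfNeeded(ValidationConfiguration config, ValidationReport report, String reportUrl) {
        if (!report.hasFailures) {
            return; // nothing to report, no notification sent
        }
        for (String email : config.notificationEmails) {
            // A real implementation would go through a mail service; here we only log.
            System.out.println("To " + email + ": validation failed, see report at " + reportUrl);
        }
    }
}
```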

Problems resolution

As a developer...

  1. GIVEN that a service A failed during WSDL consistency validation
  • AND I have been notified as the developer of the service B that requires it
  • WHEN I fix and redeploy my service B and make sure everything works again
  • AND I reset the WSDL consistency test for service A
  • THEN the validation for service A works again, and uses the new WSDL to test consistency (a baseline-reset sketch follows)
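
In this scenario, "resetting" the WSDL consistency test amounts to replacing the stored baseline WSDL of service A with the one it currently publishes. A possible sketch (WsdlBaselineStore is a hypothetical name, not an existing class):

```java
import java.net.URL;
import java.util.HashMap;
import java.util.Map;
import java.util.Scanner;

/** Illustrative: keeps, per service, the WSDL that future runs are compared against. */
class WsdlBaselineStore {
    private final Map<String, String> baselineByService = new HashMap<String, String>();

    /** Reset: the WSDL currently published by the service becomes the new baseline. */
    void reset(String serviceUrl) throws Exception {
        try (Scanner s = new Scanner(new URL(serviceUrl + "?wsdl").openStream(), "UTF-8")) {
            baselineByService.put(serviceUrl, s.useDelimiter("\\A").next());
        }
    }

    String baselineFor(String serviceUrl) {
        return baselineByService.get(serviceUrl);
    }
}
```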

As an architect...

  1. GIVEN that an external service failed during availability validation
  • AND I have been notified
  • WHEN I contact the provider of that service to solve this issue
  • AND the service is back to normal
  • AND I run a manual validation of that service
  • THEN the service passes the validation again

UI requirements

  • Services browsing
  • Scaffolded service client
  • Services validation
    • Service selection
    • Validators selection & configuration
    • Validation scheduling
    • Notifications configuration
    • Validation reports & mail notifications
  • Validation dashboard (view the state of the services)

Model requirements

We need to be able to store (a possible shape is sketched after this list):

  • Web service exchanges (outside the document model?)
  • Services (their names, URLs, the URLs of their EasySOA proxy(ies))
    • Service validation configurations & schedules
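
As a rough illustration of these requirements, one service entry could carry the following fields (placeholder names, not the Nuxeo document schema):

```java
import java.util.List;

/** Illustrative only: the information the model needs to hold per service. */
class ServiceEntry {
    String name;
    String url;
    List<String> easySoaProxyUrls;                    // the EasySOA proxy(ies) in front of it
    List<ValidationConfiguration> validationConfigs;  // see the configuration sketch above
    // Web service exchanges may live outside this document model, as noted above.
}
```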

XXXs: Missing information

  • XXX1: When my own web services change in production, do we need the capability to notify users of these services?

Draft: AXXX - sanity check of consumed SOA

As an SOA architect or operator,

  • I want to compare how my own unchanged SOA is linked to and behaves with an external, unknown SOA to be sanity-checked, so that I can detect changes in links and behaviour
    • (sanity check ("validation") dashboard ; in 0.4, see wiki doc)
  • I want to be notified when a service definition (WSDL) changes on an external provider's endpoint, so that I can ask its provider about it (in Nuxeo or not) or patch my code (possibly in a proxy)

As a test maintainer,

  • I want to be notified when a functional test involving a given external endpoint fails...

=> after an auto sanity check, notify the interested people (in the model: a link to the owners of the endpoint or of the tests...) of the errors found in the report, by sending them a link to the report saved in Nuxeo, Jenkins-like

As an SOA architect or operator,

  • I want to schedule an auto sanity check, so that the level of SOA sanity is maintained over time (a scheduling sketch follows)
    • => UI to control web discovery recording done by a dedicated filter
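
For the scheduling itself, any scheduler would do (cron, Quartz, ...); the sketch below simply uses a ScheduledExecutorService together with the illustrative configuration and notifier classes from the earlier sections:

```java
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

/** Illustrative: run the sanity check at a fixed interval and notify on failure. */
class SanityCheckScheduler {

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    void schedule(final ValidationConfiguration config, final List<ServiceValidator> validators,
                  final ValidationNotifier notifier, final String reportUrl) {
        scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                ValidationReport report = config.runOnce(validators);
                notifier.notifyIfNeeded(config, report, reportUrl);
            }
        }, 0, 24, TimeUnit.HOURS); // here: once a day; a cron expression would be more flexible
    }
}
```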