KDE PIM/Akonadi/Testing
    Revision as of 10:20, 30 June 2011

    This page describes the Akonadi test and benchmark facilities.

    Tip
    Further Reading: Akonadi Development Tools.


    Akonadi Testrunner

Igor's GSoC project, found in kdepimlibs/akonadi/tests/testrunner. The Akonadi Testrunner sets up an isolated Akonadi server (which implies a separate D-Bus server) based on an environment configuration file.

    Creating Testrunner Environments

A testrunner environment consists of two components: a set of configuration and data files and an XML description file of the environment.

    Here is an example listing based on the environment used for the libakonadi unittests:

    unittestenv/
    unittestenv/config.xml
    unittestenv/xdglocal
    unittestenv/kdehome
    unittestenv/kdehome/share
    unittestenv/kdehome/share/config
    unittestenv/kdehome/share/config/akonadi-firstrunrc
    unittestenv/kdehome/share/config/akonadi_knut_resource_0rc
    unittestenv/kdehome/testdata.xml
    unittestenv/xdgconfig
    

The environment description file (config.xml) looks like this:

    <config>
      <!--  path to KDE configuration ($KDEHOME) -->
      <kdehome>kdehome</kdehome>
      <!-- path to Akonadi configuration, ie. the stuff that
           usually goes into ~/.config/akonadi/ -->
      <confighome>xdgconfig</confighome>
      <!-- path to Akonadi data, ie. the stuff that usually
           goes into ~/.local/share/akonadi/ -->
      <datahome>xdgdata</datahome>
      <!-- load resources of the specified types -->
      <agent synchronize="true">akonadi_knut_resource</agent>
      <!-- set environment variables -->
      <envvar name="AKONADI_DISABLE_AGENT_AUTOSTART">true</envvar>
    </config>
    

The first three elements define the relevant paths inside the environment data, relative to the config.xml file. The <agent> element can be used to create instances of the specified agent (multiple such elements are allowed). If the agent is a resource, it can also be synchronized initially by adding the synchronize="true" attribute; in that case, tests will not be launched until the synchronization has completed.
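
For instance, a configuration that needs two instances of the same resource could combine multiple <agent> elements like this (a sketch; whether both instances are synchronized is up to the test):

```xml
<!-- sketch: two instances of the Knut resource; only the first one
     is synchronized before the tests are launched -->
<agent synchronize="true">akonadi_knut_resource</agent>
<agent>akonadi_knut_resource</agent>
```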

    Agents set up in this way can be configured by simply providing the corresponding configuration file in $KDEHOME, such as akonadi_knut_resource_0rc in our example.
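
As an illustration, such a configuration file might look as follows. The keys shown here are hypothetical; the actual keys depend on the configuration options of the resource in question:

```ini
# unittestenv/kdehome/share/config/akonadi_knut_resource_0rc
# (hypothetical content; the real keys depend on the resource)
[General]
FileName=testdata.xml
```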

Global configuration files can be provided in the same way. In particular, akonadi-firstrunrc as shown below is useful to prevent the Akonadi default setup mechanism from interfering with the test:

    [ProcessedDefaults]
    defaultaddressbook=done
    defaultcalendar=done
    

The same applies to kdedrc, which allows disabling kbuildsycoca4. That can greatly speed up tests.

    [General]
    CheckSycoca=false
    CheckFileStamps=false
    

The <envvar> element allows you to set arbitrary environment variables inside the test environment. One useful example is AKONADI_DISABLE_AGENT_AUTOSTART, which prevents the Akonadi server from starting autostart agents and can further speed up the setup process.

    Using the Testrunner

    Interactive Use

    For manual usage, the testrunner provides an interactive mode in which it sets up the environment and provides a way to "switch" into it.

    First, start the testrunner:

    $ akonaditest -c config.xml --testenv /path/to/testenvironment.sh &
    

Note: Although the testenv parameter is not required, it makes life a bit easier when testing manually. If you don't pass it, the script will be generated in a temporary directory and is therefore a bit harder to find.

    Once the setup is complete, it creates a shell script containing the necessary environment variable changes to switch into the test environment:

    $ source /path/to/testenvironment.sh
    

The environment variables of the current shell are then changed to point to the test environment (e.g. KDEHOME, DBUS_*, etc.). Every Akonadi application run in that shell operates on the Akonadi server of the test environment.

To terminate and clean up the test environment, run:

    $ shutdown-testenvironment
    

Note that afterwards your shell still points to the (now removed) test environment and might no longer work as expected.

    Non-Interactive Use

    kdepimlibs/akonadi/tests uses the Akonadi Testrunner to run unittests in an isolated environment. For automated usage, the testrunner can be used in a non-interactive way:

$ akonaditest -c config.xml <command> <params>
    

The testrunner will run <command> <params> within the isolated environment and terminate afterwards.

    This can be used from within CMake (example based on kdepimlibs/akonadi/tests):

    macro(add_akonadi_isolated_test _source)
      set(_test ${_source})
      get_filename_component(_name ${_source} NAME_WE)
      kde4_add_executable(${_name} TEST ${_test})
      target_link_libraries(${_name}
        akonadi-kde akonadi-kmime ${QT_QTTEST_LIBRARY}
        ${QT_QTGUI_LIBRARY} ${KDE4_KDECORE_LIBS}
      )
    
      # based on kde4_add_unit_test
      if (WIN32)
        get_target_property( _loc ${_name} LOCATION )
        set(_executable ${_loc}.bat)
      else (WIN32)
        set(_executable ${EXECUTABLE_OUTPUT_PATH}/${_name})
      endif (WIN32)
      if (UNIX)
        set(_executable ${_executable}.shell)
      endif (UNIX)
    
      find_program(_testrunner akonaditest)
    
      add_test( libakonadi-${_name} 
        ${_testrunner} -c
        ${CMAKE_CURRENT_SOURCE_DIR}/unittestenv/config.xml
        ${_executable} 
      )
    endmacro(add_akonadi_isolated_test)
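
The macro can then be applied to each test source file. The file name below is illustrative:

```cmake
# hypothetical test source; this creates the target "itemfetchtest" and
# registers it as the CTest test "libakonadi-itemfetchtest", which is
# executed through akonaditest in the isolated environment
add_akonadi_isolated_test(itemfetchtest.cpp)
```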
    

    Using QtTest unittests with KDE extensions (QTEST_KDEMAIN) together with the testrunner is problematic as they modify some of the environment variables set by the testrunner. Instead, use the following:

    #include <qtest_akonadi.h>
    
    QTEST_AKONADIMAIN( MyTest, NoGUI )
    

    KNUT Test Data Resource

In kdepim/akonadi/resources: a fully featured resource that operates on a single XML file. The file format is described in knut.xsd and closely follows the internal structure of Akonadi. New files can be created, e.g. in Akonadiconsole, by creating a resource and specifying a non-existing file.
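
To give an idea of the format, a minimal data file might look like the following sketch. The element names here are illustrative; consult knut.xsd for the authoritative schema:

```xml
<!-- hypothetical minimal Knut data file; see knut.xsd for the real schema -->
<knut>
  <collection name="Test Collection" content="inode/directory,message/rfc822">
    <item mimetype="message/rfc822">
      <payload>From: test@example.com ...</payload>
    </item>
  </collection>
</knut>
```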

    Akonadi Benchmarker

In kdepimlibs/akonadi/tests, part of Robert's thesis. It is a set of tests that show the time needed to process many item/collection operations.

Warning
This section needs improvements: Please help us to

clean up confusing sections and fix sections which contain a todo


    Unittests

    Akonadi Server

Usable without installation; run with ctest/make test as usual.

kdepimlibs/akonadi

    These tests use the Akonadi Testrunner, the test environment is found in kdepimlibs/akonadi/tests/unittestenv.

    Setup

    The tests do not yet completely work without having certain components installed, namely:

    • Akonadi Server
    • KNUT resource

    Running the tests

    The tests can be run automatically using ctest/make test as usual. To run a single test manually, it needs to be executed using the Akonadi testrunner:

$ cd kdepimlibs/akonadi/tests
$ akonaditest -c unittestenv/config.xml $BUILDDIR/test_executable.shell
    

    kdepim/akonadi

    Are there any?


    Resource Testing

    Tools to automatically test Akonadi resources are currently in development in playground/pim/akonaditest/resourcetester. There are two basic modes of operation, read tests and write tests. The resourcetester tool provides convenience methods for common operations needed to perform those tests.

    Read Tests

    To verify the read code in a resource works correctly we need to read pre-defined test data from the resource and compare that with independently provided reference data.

    Write Tests

    Once the reading code is verified we can use that to verify the writing code. This is done by writing a change to the resource, re-creating it to ensure the change was persistent and finally comparing the re-read change with the expected result.

