== getting #includes right ==
There are two types of #include statements: <tt>#include <foo.h></tt> and <tt>#include "foo.h"</tt>.

Say we have the file <tt>xyz.h</tt> in <tt>/usr/include/mylib/</tt> that contains the following:
<code cpp n>
#include <header1.h>
#include "header2.h"
</code>

The preprocessor will search for the file <tt>header1.h</tt> in all the paths given as <tt>-I</tt> arguments and then replace the line with the contents of that file.

For line 2 the preprocessor first tries the file <tt>/usr/include/mylib/header2.h</tt>; only if that file does not exist does it search the include paths as it did for <tt>header1.h</tt>. The important point to note here is that the preprocessor does not look in the directory of the source file that includes <tt>xyz.h</tt>, but in the directory where <tt>xyz.h</tt> itself resides.
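To make this lookup order concrete, here is a small invented layout (the paths and file names are for illustration only):
<code cpp>
// /usr/include/mylib/xyz.h
#include <header1.h>   // searched in the directories given with -I
#include "header2.h"   // searched first as /usr/include/mylib/header2.h,
                       // i.e. next to xyz.h, and only then in the -I directories

// /home/user/project/main.cpp, compiled with: g++ -I/usr/include/mylib main.cpp
#include <xyz.h>       // the quoted include inside xyz.h is resolved relative
                       // to /usr/include/mylib/, not relative to main.cpp
</code>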
Now, which include statement is the one to use? After all, you can add every directory you want with <tt>-I</tt> and could therefore use <tt>#include <...></tt> everywhere.
=== as an application developer ===
* Include headers from '''external''' libraries using '''angle brackets'''.
<code cpp>
#include <iostream>
#include <QtCore/QDate>
#include <zlib.h>
</code>
* Include headers from your '''own project''' using '''double quotes'''.
<code cpp>
#include "myclass.h"
</code>
Rationale: ''The header files of external libraries are obviously not in the same directory as your source files. So you need to use angle brackets.''

''Headers of your own application have a defined location relative to the source files of your application. With KDE4's CMake macros the source directory is the first include switch passed to the compiler, so there is no difference between using angle brackets and double quotes. If you work with a different buildsystem that does not add the current source directory, or if CMAKE_INCLUDE_CURRENT_DIR is disabled, then all includes (inside your application) that use angle brackets will break.''

''Ideally the buildsystem would not need to specify <tt>-I<source directory></tt> at all, though, as that can break with library headers that have the same filename as a header of your project: if a library ships a header <tt>foo.h</tt> and your project contains a different file with the same name, the compiler will always pick the header from your project instead of the one from the library, because the source directory of the project is specified first.''
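A made-up example of that failure mode, with all names invented for illustration:
<code cpp>
// Hypothetical layout:
//   src/main.cpp
//   src/version.h                  <- a header of the project itself
//   /usr/include/libbar/bar.h      <- a header of an external library
//   /usr/include/libbar/version.h  <- the library's own version.h
//
// The application is compiled with: g++ -Isrc -I/usr/include/libbar src/main.cpp

// Inside /usr/include/libbar/bar.h:
#include <version.h>  // resolves to src/version.h, because -Isrc comes first
</code>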
=== as a library developer ===
* Include headers from '''external''' libraries using '''angle brackets'''.
<code cpp>
#include <iostream>
#include <QtCore/QDate>
#include <zlib.h>
</code>
* Include headers of your '''own library''' and libraries that belong to it using '''double quotes'''.
<code cpp>
#include "xyz.h" // same library and same directory
</code>

Rationale: ''The header files of external libraries are obviously not in a fixed location relative to your source files. So you need to use angle brackets.''

''Headers of your own libraries have a fixed relative location in the filesystem. Therefore you ''can'' use double quotes, and you should, because otherwise an include statement could pull in a different header file than expected. An example of how angle brackets can break the build:''

''<tt>/usr/include/libxyz/xyz.h</tt> includes <tt>foo.h</tt> using angle brackets and expects it to be replaced with the contents of <tt>/usr/include/libxyz/foo.h</tt>. Assume another library also ships a file <tt>foo.h</tt>, in the directory <tt>/usr/include/anotherlib/</tt>. If an application that uses both libraries compiles with "<tt>g++ -I/usr/include/libxyz -I/usr/include/anotherlib ...</tt>", libxyz works as expected. If the application compiles with "<tt>g++ -I/usr/include/anotherlib -I/usr/include/libxyz ...</tt>", the header <tt>xyz.h</tt> includes the file <tt>/usr/include/anotherlib/foo.h</tt> instead of the file shipped with libxyz. The same problem can appear if an application has a header file with the same name as a library header and specifies <tt>-I./</tt> as the first include directory.''

If you use subdirectories for the installed header files, you need the exact same directory structure for the headers in the source directory. Example: <tt>/usr/include/libfoo/</tt> contains the directory <tt>bar</tt>. The header <tt>header1.h</tt> resides in <tt>libfoo</tt>, the file <tt>header2.h</tt> in <tt>libfoo/bar</tt>. The latter depends on the former, so it includes it using <code cpp>#include "../header1.h"</code> If the source directory structure of the library is not the same (in this case: <tt>header2.h</tt> in a subdirectory of the directory where <tt>header1.h</tt> resides), this obviously breaks.
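A sketch of the matching directory layouts described above (the source-tree paths are only an example):
<code cpp>
// installed headers:                   // matching source tree of libfoo:
//   /usr/include/libfoo/header1.h      //   libfoo/header1.h
//   /usr/include/libfoo/bar/header2.h  //   libfoo/bar/header2.h

// libfoo/bar/header2.h finds header1.h both in the source tree and once installed:
#include "../header1.h"
</code>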

== audio devices: use-cases ==

=== Carl: a power-user on the move ===

Carl uses his laptop for private use and work. He works at home, while travelling on a train, or at the office. He uses the following hardware:

# built-in HDA soundcard with jacks for headphones and a microphone; the laptop also contains built-in stereo speakers and a surround speaker option in the mixer.
# a USB headset: a simple USB audio device with stereo playback and one microphone. It also has two buttons to increase/decrease the volume, which send the volume media-key keyboard events.
# a monitor with a built-in USB soundcard (a USB audio device that also contains a mixer device): stereo speakers and a built-in microphone. Playback to the monitor speakers is also possible through HDMI (actually DisplayPort, but Linux reports it as HDMI). In addition the monitor has a built-in webcam, which is attached via the same USB plug.
# a HiFi setup whose line-in is connected to the laptop's headphone jack with a Cinch-to-3.5mm cable. The speakers are arranged such that the balance must be adjusted slightly to the right to get a centered stereo image where the laptop user sits.
# standard 3.5mm jack headphones (for use on the train, and sometimes also at the desk at home or at work).
# an Alesis io|2 USB pro-audio soundcard. Carl uses this device at home (or when he is working as a sound engineer) to record; his favorite tool for this job is Ardour.
# at home Carl sometimes also connects his laptop to his TV via HDMI and uses the TV's stereo speakers for audio output.

When Carl works from home he uses Skype, a SIP application, and an H.323 application (e.g. Ekiga for both SIP and H.323) to provide VoIP connectivity to his colleagues and to be reachable via a phone number that is not his private one. To get the best sound quality he wants to use his USB headset whenever it is connected. If the headset is not connected he wants to be able to answer calls with the built-in speakers and microphone. When he then connects the headset while in a call, he wants the sound to migrate to the headset automatically.

When at work, he wants to use the microphone and speakers of the monitor for VoIP applications. If he plugs headphones into the 3.5mm jack of the laptop, he'd like to use those as the output device. On unplugging the headphones, the output should migrate back to the monitor.

Event sounds (this includes the ringing sounds of VoIP applications on incoming calls) should go:
* <em>at work:</em> to the monitor speakers, unless headphones are plugged in, in which case they should go to both the headphones and the monitor speakers
* <em>at home:</em> to the internal sound card/headphone jack and, if plugged in, additionally to the USB headset

Sometimes Carl uses his laptop to play a short round of Wesnoth against a friend while talking to him on Skype. He uses the USB headset for the call and wants Wesnoth to use the USB headset for sound output as well. Since the Wesnoth action is not always very intense ;) Carl also starts Amarok in the background and wants it to play to the headset, too. This requires the Skype output to be louder than the Amarok and Wesnoth "noise", because talking to his friend is the most important thing to him.

=== Ami: a desktop system in the living room ===

Ami has her desktop computer on a desk in the living room. The internal HDA soundcard is connected to the monitor speakers via the 3.5mm front output jack and to high-quality active speakers via the 3.5mm back output jack. She also has headphones with a 3.5mm connector, which she can plug into either the headphone jack of the desktop or the headphone jack of the monitor. A USB webcam with a built-in microphone is attached, providing the only microphone of this system, for use with Skype. The nearby TV is attached to the graphics card via an HDMI cable and can provide stereo audio output.

For most of her time at the computer she only needs event sounds and the audio of web videos on her monitor speakers. When she wants higher-quality playback, she turns on the active speakers and migrates the music and video audio to them, while event sounds stay on the monitor speakers. To watch a DVD or some of her videos she uses the TV and wants to use either the TV speakers or the active speakers.

When using Skype, she wants to capture from the webcam and use the monitor speakers. But sometimes she'd rather move over to the couch and TV for a longer chat.

== Implications for the GUI ==

Generally the system should be as smart as possible and provide the best possible defaults to minimize configuration tasks. Things the computer cannot recognize on its own:

* what kind of device is used when a 3.5mm connector is plugged into the headphone (or any line out) jack
* ?

The following special events could be interesting for smart audio device management to monitor and handle (a rough sketch of such event-driven output selection follows the list):

* a video window is moved from one monitor to another
* the monitor setup is reconfigured (e.g. switching from monitor to TV output)
* a Skype/SIP/H.323 call is active, including who is on the other side (friend vs. business contact)
* an incoming VoIP call arrives
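To make the intended behaviour concrete, here is a minimal sketch of per-use-case output selection with hotplug fallback. This is not an existing API; the device names, the preference list, and the functions are invented for illustration only.
<code cpp>
// Minimal sketch: each use-case keeps an ordered preference list and the
// system falls back to the next connected device when the setup changes.
#include <iostream>
#include <set>
#include <string>
#include <vector>

// Output devices preferred for "communication", most preferred first (invented).
const std::vector<std::string> preferences = {
    "USB headset", "3.5mm headphones", "monitor speakers", "internal speakers"
};

// Pick the first preferred device that is currently connected.
std::string currentOutput(const std::set<std::string> &connected)
{
    for (const std::string &device : preferences) {
        if (connected.count(device)) {
            return device;
        }
    }
    return "internal speakers"; // last resort, always present
}

int main()
{
    std::set<std::string> connected = {"internal speakers", "monitor speakers"};
    std::cout << currentOutput(connected) << '\n'; // monitor speakers

    connected.insert("USB headset");               // hotplug: headset plugged in
    std::cout << currentOutput(connected) << '\n'; // call migrates to the headset

    connected.erase("USB headset");                // headset unplugged during a call
    std::cout << currentOutput(connected) << '\n'; // falls back to monitor speakers
}
</code>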

That leaves the following setup changes to manual intervention:

* switching the balance on the headphone jack between centered and slightly adjusted to the right
* turning playback on the different output jacks on/off (this might be configured as two or three setups which can then be recalled somehow)