Introduction
Police and other emergency services in South Africa require reasonably secure communication, in order to discourage the abuse of transmitted information for personal gain or criminal purposes.
Traditionally, police use their own frequencies (sometimes with an unusual modulation for the band) to ensure privacy. Integrated wide-band radio receiver chips have made these analogue measures ineffective. This was to be expected, since the approach amounts to "security through obscurity": it relies on no one discovering the frequency or obtaining a suitable radio, which is impossible to prevent in the long term. Scanners allow anyone in possession of one to receive a wide range of radio signals, including police communication. In addition, anyone skilled in electronics can build or modify a radio capable of receiving a wide range of frequencies.
Legislation often exists banning the possession, sale and/or use of scanners, but this does not prevent people from obtaining or building them. Scanners do not transmit significant signals, which makes them hard to detect. (Tempest-like methods, as possibly used to detect pirate TV viewers when enforcing TV licence legislation, might work from nearby.) A criminal, who is probably committing crimes far more serious than owning or using a scanner, can use one to actively avoid the police, in which case the benefits far outweigh the risks.
Digital communication can solve these problems through encryption. With proper error-correction codes, the range of the radios can also be extended. The design of such a system can be based on a conditional access system, such as those used for satellite TV. While I am not familiar with any specific system, it is not hard to design one based on a mixture of public-key and symmetric encryption that would be relatively secure.
The basics
Using a purely public-key system is not practical, since each radio would need to keep track of the public keys of all radios it wants to communicate with. A symmetric-key method is much more practical, but it has the problem that only a single key needs to be compromised in order to gain access to all encrypted communication. To ensure that a compromised key or device does not grant an attacker long-term access to the encrypted information, it is critical that the key changes often (at least a few times a day). For a pair of devices, securely exchanging keys is not really a problem (Diffie-Hellman key exchange can be used); distributing keys to a whole group is, and this is where the public-key system comes in.
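As a toy illustration of the pairwise case, the snippet below sketches Diffie-Hellman key agreement in Python. The group parameters are deliberately tiny and insecure, chosen only to keep the example self-contained; a real system would use a standardized, much larger group.

```python
import hashlib
import secrets

# Toy group parameters for illustration only; a deployed system would
# use a vetted group with a prime of 2048 bits or more.
P = 4294967291  # the prime 2**32 - 5
G = 2

def keypair():
    """Generate a private exponent and the matching public value."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_key(their_pub, my_priv):
    """Derive a 256-bit symmetric key from the shared DH secret."""
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(shared.to_bytes(8, "big")).digest()

# Two radios agree on a symmetric key without ever transmitting it.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert session_key(b_pub, a_priv) == session_key(a_pub, b_priv)
```

Each side combines its own private exponent with the other side's public value, so both arrive at the same symmetric key while an eavesdropper sees only the public values.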
Key distribution
The keys are much smaller than the data they protect. This makes it practical to distribute a large number of copies of each key, every copy encrypted with the public key of a recipient. The recipient needs a private key in order to decrypt the symmetric cipher's keys, and it is preferable that this private key is stored in the hardware used to decrypt the data. A smart card is the ideal vehicle for this purpose (provided the key is stored securely and cannot easily be exposed or cloned). Several keys, each marked with the time it comes into effect, need to be distributed, since the radios may be out of range of the key distribution system for some time. (This reduces security somewhat by extending the time for which a stolen device or smart card remains useful.) The encrypted keys can be transferred to the devices by several methods, such as in-band distribution, central distribution from the police station, GSM modems, etc.
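A minimal sketch of how a radio might select the active key from a batch of pre-distributed, time-marked keys. The schedule format and function name are my own illustration, not part of any real system:

```python
from datetime import datetime

def active_key(schedule, now):
    """Pick the most recent key whose effective time has passed.

    schedule: list of (effective_from: datetime, key: bytes), any order.
    """
    eligible = [(t, k) for t, k in schedule if t <= now]
    if not eligible:
        raise LookupError("no key is active yet")
    # The latest effective time that is not in the future wins.
    return max(eligible, key=lambda entry: entry[0])[1]

# Keys pre-loaded for the next few changeovers:
schedule = [
    (datetime(2008, 10, 27, 0, 0), b"morning-key"),
    (datetime(2008, 10, 27, 12, 0), b"afternoon-key"),
]
assert active_key(schedule, datetime(2008, 10, 27, 8, 0)) == b"morning-key"
```

Because each radio holds the next few keys in advance, it keeps working while out of range of the distribution system, at the cost of a longer useful life for a stolen device.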
Security of the system
The system is kept secure by strictly keeping track of the smart cards in use. If a smart card is lost or stolen, distribution of keys encrypted with its public key is simply discontinued. Strict procedures, such as weekly or daily automated audits of the smart cards, should be used. (The system should require the smart card to be physically present to verify its identity.)
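Revocation then amounts to omission: the key-distribution step simply produces no encrypted copy for a revoked card. The sketch below illustrates this; toy_wrap is a hypothetical stand-in for real public-key encryption and offers no security whatsoever:

```python
import hashlib

def toy_wrap(card_pub: bytes, key: bytes) -> bytes:
    """Stand-in for real public-key encryption: XOR the key with a hash
    of the card's public key. Illustrative only; offers no security."""
    pad = hashlib.sha256(card_pub).digest()[: len(key)]
    return bytes(a ^ b for a, b in zip(key, pad))

def distribute(period_key, cards, revoked):
    """Produce an encrypted copy of the period key for every card that
    is not on the revocation list."""
    return {
        cid: toy_wrap(pub, period_key)
        for cid, pub in cards.items()
        if cid not in revoked
    }

cards = {"card-001": b"pub1", "card-002": b"pub2"}
out = distribute(b"secret-key-0001", cards, revoked={"card-002"})
assert "card-002" not in out  # the stolen card receives nothing
```

Once the keys a revoked card already holds expire, it is locked out without any over-the-air kill command being needed.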
Weaknesses and countermeasures
The basic weaknesses in the system are similar to those in pay-TV systems: a device can be modified to share the decrypted keys, or the hardware used to decrypt them, with other devices. This can give unauthorized devices access to the encrypted signal.
The hardware decryption device can be designed to allow only a limited number of decryption operations in a given time-frame in an attempt to limit this risk, but hardware modification or caching of the decrypted keys can bypass such measures.
The difficulty of properly managing devices that go missing increases with the number of devices. The system should therefore be managed on a relatively small scale, such as within a city.
Conclusion
Regulatory methods of controlling access to private radio signals are ineffective and provide only reactive security: unauthorized receivers can be destroyed if they are found.
It is possible to provide a much higher level of security to communication within an exclusive group of users, such as a police department or the subscribers of a pay-TV system. This method also allows for central manageability and access revocation.
Monday, October 27, 2008
Monday, September 8, 2008
HTTPS should not require special treatment by browsers
Google Chrome was recently criticized for indexing information transferred from HTTPS pages such as Internet banking.
While the concern that private information may be indexed is valid, the best solution is not to exclude HTTPS from indexing by default. Many useful sites that contain no private data are served over HTTPS, and users benefit from having this information easily available. Some browsers even exclude HTTPS from caching by default.
The HTTP standard specifies a much better way to ensure that certain data is excluded from caching (it is probably a good idea to exclude it from indexing as well in such cases).
The HTTP 1.1 standard states "Unless specifically constrained by a cache-control directive, a caching system MAY always store a successful response as a cache entry..."
Unless a response from a server is specifically marked not to be cacheable, any browser (or proxy for normal HTTP) should try to cache the response in order to improve the user experience.
How sensitive data should be protected
Even though caches improve the user experience, some data should never be stored. The data mentioned in the linked articles falls within that category. This data is usually transferred over HTTPS to ensure its privacy and integrity in transit between the server and the user.
HTTP 1.1 provides a mechanism to ensure that this data is protected at the end points (and at caches, for plain HTTP): the "Cache-Control" header. This header allows data to be tagged with several levels of cacheability. Anything marked with anything other than a no-store Cache-Control header should be expected to be cached, at least in a limited way, by the browser. (Other directives are intended to ensure that a cache does not return out-of-date data, not to ensure its privacy on the user's computer.)
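As a sketch, a browser cache's store/don't-store decision under this rule reduces to a single check for the no-store directive. This is a simplified illustration (real caches consider more directives, and it assumes header names are already normalized):

```python
def may_store(headers):
    """True if a client cache may store the response, per the simplified
    Cache-Control reading above."""
    directives = {
        d.strip().lower()
        for d in headers.get("Cache-Control", "").split(",")
        if d.strip()
    }
    return "no-store" not in directives

assert may_store({}) is True                                  # no header: storable
assert may_store({"Cache-Control": "private, max-age=60"}) is True
assert may_store({"Cache-Control": "no-store"}) is False      # never store
```

Note that even "private" and "no-cache" still permit limited storage by the browser itself; only no-store forbids it outright.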
Most browsers since the days of Internet Explorer 4 support enough of HTTP 1.1 to understand Cache-Control headers.
Banks and other sites should therefore ensure that they include the correct headers in the responses from their servers. They should not prevent non-sensitive content such as static style-sheets, scripts and images from being cached, since reloading this data on each visit degrades the user experience and wastes bandwidth. Depending on browsers to be more paranoid than the standards require is irresponsible.
If sensitive data leaks, the party responsible for the disclosure should be held responsible. This can be the user, if his/her system's security was breached (due to his/her negligence), the browser vendor, if the browser does not follow the standards and caches data that is marked no-store, or the party serving the content if they do not mark their content properly.
Interoperability with HTTP 1.0
HTTP 1.0 does not provide the Cache-Control header. In most such cases a Pragma: no-cache header over HTTPS should be enough to exclude the page totally from caching (this seems to be the common behaviour). When HTTP 1.1 is used, the finer-grained Cache-Control header should be used if present. (Falling back to HTTP 1.0-like behaviour in its absence is probably a safe option.)
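The fallback logic above can be sketched as follows. This is a simplified illustration of the behaviour described, not a complete implementation of either standard:

```python
def storage_allowed(headers, https):
    """Decide whether a response may be stored, with an HTTP 1.0 fallback.

    Prefers the finer-grained Cache-Control header when present; in its
    absence, treats Pragma: no-cache over HTTPS as "do not store at all".
    """
    cc = headers.get("Cache-Control")
    if cc is not None:
        return "no-store" not in {d.strip().lower() for d in cc.split(",")}
    if https and "no-cache" in headers.get("Pragma", "").lower():
        return False
    # Over plain HTTP, Pragma: no-cache historically means "revalidate
    # before reuse" rather than "never store", so storage is allowed.
    return True

assert storage_allowed({"Pragma": "no-cache"}, https=True) is False
assert storage_allowed({"Cache-Control": "no-store"}, https=True) is False
```

The key point is the precedence: when a Cache-Control header is present it wins, and the Pragma heuristic only applies to legacy responses.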
Deja Vu
The outcry over Chrome indexing pages transferred over HTTPS is reminiscent of the reaction after Google started indexing pages hosted on HTTPS in 2002.
An article written then sums it up nicely: "The misconception that Google is going where it shouldn't comes partly from the somewhat vague definition of "secure." The SSL protocol is simply a transmission protocol. It has nothing to do with whether an individual page should be considered "secure" or not."
Labels:
browsers,
cache,
cache-control,
chrome,
firefox,
google chrome,
http,
https,
mozilla,
mozilla firefox,
privacy,
security,
standards,
web standards
Monday, June 23, 2008
Disclaimer
The contents of this blog and my website are my opinion and not that of my employer.
The accuracy of information posted on this blog and website cannot be guaranteed. I do not take any responsibility for any damage or loss sustained as a result of the information posted here. You are responsible for checking its accuracy. If instructions are provided, you should verify the safety and legality of following them.
Monday, April 28, 2008
Why there is no worse time to buy a TV in ZA
Looking to buy a new television? Now might be the worst possible time to buy one.
Countries worldwide are switching to digital TV broadcasts, and South African digital broadcasts are set to begin on 1 November 2008. Digital TV has several advantages over current analog broadcasts. A few families of standards exist for digital broadcasts, such as DVB, ATSC, ISDB and DMB; DVB seems to be the most widely used.
Advantages of digital TV (mostly from common sense and the Wikipedia article):
- No snowy pictures, ever
- Uses less bandwidth (more channels in the same amount of spectrum)
- HD broadcasts possible
- Additional features possible (such as optional subtitles, multiple sound tracks, electronic program guides, etc.)
- Better picture and sound quality (depending on the amount of compression used)
Disadvantages:
- In the case of very bad reception no image will be seen, rather than a snowy one (however, the picture stays perfect far longer than an analog signal of the same power)
- New equipment is needed to receive digital signals
- Too much compression can degrade image quality
Existing analog televisions will not be able to receive digital broadcasts, but this will not affect them immediately: analog broadcasts are planned to continue alongside digital broadcasts until 1 November 2011. However, when the analog transmitters are switched off (to allow reuse of the spectrum for other services), your existing TV and VCR will be useless without a special set-top box (STB), similar to current DSTV decoders. This means that you will need to buy a separate set-top box for every TV or video machine if you want to be able to watch or record more than one channel at a time (each set-top box receives a single channel, selected with its own remote).
Most of the televisions available in South Africa at the moment can only receive analog signals, which means that without additional equipment you will be unable to use your new TV in three years' time. Of course, things such as consoles and existing set-top boxes will continue working.
It is hard to find information on the standards selected for digital broadcasting in South Africa, but DVB-T and Eureka 147 (for radio) seem likely based on the government's switchover site. In the case of radio, however, it was decided that FM broadcasts will continue indefinitely.
If you want to buy a new TV or VCR, it seems like a good idea to wait until November for confirmation of the selected standard and to see how digital broadcasting performs.
Government should have done more to inform the public about the switchover, and should not have allowed analog televisions to be sold without a warning that they will be useless in three years' time.
Additional links:
Speech on switchover
Update (2008-09-10): I have received confirmation that some form of DVB-T will be used, with an MPEG-4 codec. The SABC might want to force a conditional access system (a form of DRM), which would mean that even with a digital TV you would still need an STB, with all its disadvantages. If an STB is required, the only advantage of a digital TV over an analog one is likely to be better picture quality. The "security features" in the STB are probably related to a conditional access system.
Media coverage of the migration process has improved as well.
It looks like the 1 November switch-on date is likely to be postponed.
Labels:
broadcasting,
digital television,
digitalmigration,
drm,
DTV,
dvb,
dvb-t,
stb,
switchover
Sunday, April 13, 2008
Integrating Facebook with other sites
Facebook has a great feature that allows easy integration with several other sites: Facebook badges.
This handy feature allows you to display your photo and several other details from Facebook in a convenient way. Unfortunately, it is not easy to find if you do not know that it exists. I found it in Facebook's help under "profile". It can also be reached from a tiny link at the bottom of your own profile page.
Currently, the following information can be listed:
- Profile picture
- Email address
- Name
- Networks
- IM Screen names
- Birthday
- Telephone numbers
- Websites
- Status updates
- Recent pictures
- Upcoming events
- Latest notes
- Posted items
The badge itself can be generated in two formats:
- Image
- JavaScript
Saturday, February 2, 2008
Using SFU/SUA to replace Cygwin?
Microsoft's Services for Unix / Subsystem for Unix-based Applications (SUA) might make an excellent replacement for Cygwin.
Since SFU/SUA interacts directly with the Windows kernel, it should be several times faster than Cygwin. This might make SFU an excellent platform for the porting of *nix applications to Windows.
With a good package manager, SFU can be the Windows equivalent of Fink (which is an excellent product). It should be able to compile MOST *nix applications with a few minor modifications.
Several package managers have SFU/SUA ports underway, including Gentoo's and Debian's. An easy way to install these should dramatically increase their use.
SUA has one other major advantage over Cygwin: it is included with Windows 2003 R2 and Windows Vista (Enterprise and Ultimate).
Compiling (relatively) simple programs under SFU
Microsoft Services for Unix (SFU) provides a basic Unix-like environment under Windows and is useful for many things. It allows you to use things like *nix shell scripts and many *nix utilities under Windows.
For the instructions posted below, I assume that you already have a working bash installation. See this post as well as this information (the last link is the newer information) to get a working bash version installed. You may need to disable DEP from boot.ini in order to get bash running.
The steps that I took before attempting to install any packages:
- Installed Interop System's installer and bash
- Installed the GNU SDK from the SFU installer
- Installed the following development-related packages: (everything on this list might not be needed)
- autoconf
- automake
- c89
- gmake
- install
- m4-gnu
- Renamed /bin/sh and replaced it with a symbolic link to /usr/local/bash (configure scripts usually crashed with an NT exception under the standard sh (ksh, IIRC))
Using the normal "./configure", followed by "make && make install" procedure I have successfully compiled and installed the following applications:
- sarg
- nano
Learning vi
For some reason, probably laziness, I never bothered to learn one of the classic *nix text editors, vi or emacs. Normally I simply use nano/pico, since there is no need to learn them first, and I usually avoid programming from the command line.
However, for some reason MS Services for Unix's (SFU) version of crontab seems to ignore the EDITOR environment variable (I actually compiled nano from source) and always uses vi as the editor. This forced me to learn vi.
I found a nice little tutorial on using vi, which is probably a good thing to read if you are using any form of Unix.
Friday, February 1, 2008
Running squid on Windows SBS 2003
I have a Windows Small Business Server 2003 Standard SP2 machine on my network and needed a proxy server to somewhat reduce bandwidth usage.
I decided on squid, since it is free, well-supported and runs on Windows. I chose the SquidNT version, although the version available for Microsoft Services for Unix (or Subsystem for Unix-based Applications in newer Windows versions) should work as well (free registration is required for that download).
SquidNT does not include an installer and is simply extracted to the hard drive (preferably to "c:\"). It requires a bit of configuration before starting. (Simply run "squid.exe" from the command prompt to attempt to start it.) The log file ("C:\squid\var\logs\cache.log") also helps in figuring out why squid will not start. You need to generate the cache folders ("squid -z") before squid will start.
Once squid is running the configuration can be tweaked a bit further. (For basic configuration see here; for advanced tweaking see here).
Once configuration is complete, it is a good idea to install it as a service ("squid -i") and set it to start automatically from "Services" under Administrative tools.
For log file analysis I use SARG, combined with the sarg-reports script (get my configured version here). Since sarg-reports is a Unix shell script and expects to be run from a Unix-like environment, I recommend installing Microsoft Services for Unix (SFU). Once SFU is installed, install Interop Systems' package manager. After that, install at least the bash and mktemp packages ("pkg_update -L bash" for bash). Certain SFU applications need DEP to be turned off (AlwaysOff mode), which might affect security negatively.
For SARG's installation, I recommend downloading its source code from SourceForge. Extract the source code with the following command: "tar -xzf sarg-2.2.3.1.tar.gz" (replace the filename with the correct version). Change into the correct directory and run "./configure" followed by "make" and "make install". Sarg's configuration file can be found in "/usr/local/sarg". Edit it to correctly reflect the paths of your log files and web server documents directory (/dev/fs/C/inetpub/wwwroot for IIS). You might want to create a symlink from /var/www/html as well (since sarg sometimes ignores its config file).
The next step would be to install sarg-reports and log rotation. Copy sarg-reports to /usr/local/bin (C:\SFU\usr\local\bin from a Windows POV). Make it executable (chmod +x /usr/local/bin/sarg-reports). Edit sarg-reports in a text editor (supporting unix text files, such as notepad2) and modify the first line to read "#!/usr/local/bin/bash". Set any other settings that you wish.
Sarg-reports needs the GNU coreutils version of date to run. Download and install coreutils from here. Copy date.exe, libiconv2.dll and libintl3.dll from "C:\Program Files\GnuWin32\bin" to c:\SFU\common (back up the originals first). Rename /bin/date and replace it with a symbolic link to /common/date.exe. Run sarg-reports to ensure that it returns no errors.
Now the cron entries need to be added. Run "crontab -e" and add the following, followed by Ctrl-D:
00 * * * * /usr/local/bin/sarg-reports today
00 00 * * * /usr/local/bin/sarg-reports daily
00 01 * * 1 /usr/local/bin/sarg-reports weekly
30 02 1 * * /usr/local/bin/sarg-reports monthly && /dev/fs/C/squid/sbin/squid.exe -k rotate -n squid
Run "crontab -p" in order to save your password for use by cron.
SARG should run regularly now.
Labels:
cron,
sarg,
servicesforunix,
sfu,
squid,
squidnt,
windows,
windowsserver
Tuesday, January 22, 2008
Hello world!
So I actually registered my blog on Blogger... It had to happen sometime.
I hope to pollute cyberspace with loads of random ideas...
Gert van den Berg