Friday, December 29, 2006

LiveJournal to Blogger conversion/import tool - lj2blogger

Sept 29, 2007 update - An updated tool (renamed Blog2Blog) is available; see this post for the download. Regardless, we will keep this post and the download around for posterity's sake. My advice is to try the current Blog2Blog tool, which has better error handling and updated functionality, but the choice is yours.


Unknown to the Blogger staff who picked my blog as a Blog of Note, I've been working on a tool to import LiveJournal blogs into Blogger. I may as well post it today for all of you incoming viewers. I wish I had something witty and amusing to say in addition, but words fail me. I am certain I'll think of something in a couple of days; my mind just works that way -- slow and steady?

Like many good ideas, this one came from my wife, who wanted to migrate from LiveJournal to Blogger. I looked around and, after several attempts to Google it, didn't find a conversion tool. Thus, as a good software developer, I spent the time to learn several things and build one.

Note, I would say this is a Beta at this point. It works well for the scenarios I've run it through. Please feel free to comment/suggest improvements through the blog comments.

Download lj2blogger (1.5.0 updated Feb 12, 2007)


The above is a .zip with a Windows installer. To install, unzip it and double-click setup.exe.

After installing there is a small manual:
"LiveJournal2Blogger Manual.rtf" at C:\Program Files\Cooley Computing Inc\lj2blogger\doc (default location).

To save you the time of opening that file, its contents are below.

LiveJournal2Blogger (lj2blogger)

Introduction

This tool migrates a blog from LiveJournal to Blogger. It uses the public APIs of LiveJournal and Blogger to extract data from LiveJournal and create equivalent entries in a Blogger account with the appropriate date/time of the original entry.
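For the curious, the LiveJournal side talks XML-RPC, and its challenge-response login hashes the password rather than sending it in the clear. A minimal Python sketch of that hashing step (based on my reading of the public LiveJournal protocol docs, not on lj2blogger's source):

```python
import hashlib

def lj_auth_response(challenge: str, password: str) -> str:
    """LiveJournal challenge-response: MD5(challenge + MD5(password)),
    both as lowercase hex digests."""
    pw_hash = hashlib.md5(password.encode()).hexdigest()
    return hashlib.md5((challenge + pw_hash).encode()).hexdigest()

# The resulting value is sent as auth_response alongside the challenge
# in calls such as LJ.XMLRPC.getevents; the cleartext password never
# travels over the wire.
print(lj_auth_response("c0ffee", "pass1"))
```

The 32-character hex string is what the client submits; the server performs the same computation and compares.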

Features
- Download Journal Entries from LiveJournal
- A simple read-only viewer of downloaded entries
- Publish Journal Entries to Blogger
* upload private entries (it will make them public)
* parse entries and wrap http:// entries with HTML to make them active links
- Delete ALL entries for a Blogger journal
- Save/Load journal entries to/from file.

Usage

The designed usage pattern is below. For issues please feel free to contact me at pcooley.newsgroups at gmail.com.

Download from LiveJournal
1. Enter username (e.g. user1).
2. Enter password (e.g. pass1).
3. Enter lj-url (the URL of the livejournal you wish to download): http://user1.livejournal.com.
4. (optional) If necessary, modify the api-url of your journal. This is auto-populated with a typical value.
5. Press Fetch from LiveJournal Button.

View Entries (optional)
6. Press View Entries Button.
7. Use the Entry Number Dialog to change the entry being viewed.
8. When complete, press the close button in the upper right (the X).

Save Entries (optional)
1. Press Save Entries
2. Choose the filename; it is saved as an XML file with the extension *.J2B.
Note: once saved, you can load these entries without downloading from LiveJournal again (use the Load Entries button)

Publish to Blogger

9. Enter username (e.g. user1).
10. Enter password (e.g. pass1).
11. Enter the blog URL (the URL of the Blogger blog you wish to publish to): http://user1.blogspot.com.
12. (optional) If necessary, modify the api-url of your blog. This is auto-populated with a typical value. (Take note if you have a non-upgraded blog: for non-upgraded blogs you will need to update the api-url! To find the URL you will have to look at the page source of your journal. See below for more information.)
13. (optional) Select the options you please: publish private entries and/or wrap http:// with <a href="…">.
14. Press Publish to Blogger Button.
15. Select the entries you wish to publish (defaults to all entries).

Further instructions (non-upgraded blogs):
For non-upgraded blogs you will need to find your feed URI.
-- To do this, use your favorite browser and type in the URL of your blog (e.g. http://user1.blogspot.com)
-- Using your browser you will need to 'View Source'. This differs slightly between browsers.
-- In the source look for link rel="service.post" type="application/atom+xml" title="User1 (Atom 1.0)"
-- The api-url is the http:// location in the href

Additionally, it is not possible to post to an old journal with a new 'google'-integrated account (for those of you that have only partially upgraded your journals). Also, the publish dates of your LiveJournal entries won't be honored. This is part of the API limitation.


Known issues

• The Application is not multithreaded – The UI refreshes slowly while connected to LiveJournal/Blogger.
• Does not import/export comments
• found 01/02/2007 - defaults for api-url only work with upgraded blogs.
• There are connection problems with the XML-RPC interface to LiveJournal, but immediately retrying Fetch from LiveJournal seems to connect successfully. Just try a couple of times; no harm should be done.


It appears that today I made blogger.com's Blogs of Note. Incoming hits galore!


Happy Blogging


Keywords: Blog migration tool, LiveJournal to Blogger migration, LiveJournal to Blogger conversion, transfer blog to Blogger, Import blogs from LiveJournal into Blogger, lj2blogger, livejournal2blogger, migrate livejournal blogger.

Thursday, December 28, 2006

Energy Usage of My Computers (Kill-A-Watt)

A fine gift from my sister and brother-in-law last year, a Kill-A-Watt device, meant I could measure the energy usage of my computer systems. This isn't necessarily a HOWTO article, but it is informative for those of us who obsess over numbers and metrics.

The Server Closet contains:
(1) Server (512MB/AMD Athlon(tm) XP 1600+)
(2) D-Link DI-724GU Router
(3) Comcast Cable modem (Motorola SB5120)
(4) Vonage Phone Adapter (Motorola VT1005V)
(5) Oregon Scientific WMR-968 Weather Station
- Note No Monitor

Power Usage:
1.34 Amp
125 Watts
2.95 kWh per day

My Primary System consists of:
(1) User System (1GB/Pentium 4 - 3.00GHz)
(2) Monitor 1 - Dell 2001FP 20" LCD
(3) Monitor 2 - Sony GDM-500PS CRT
(4) Sound - Yamaha Receiver RXV420, KLH Speakers and Subwoofer Model HTA-4906.

Power Usage:
3.06 A
300 - 400 Watts (depending on whether it is rendering in game or relaxing writing a document)
4.30 kWh per day

Our computer room is on a 15-amp fuse, and it now makes sense that when my wife's computer system is on and she turns on an electric space heater, I end up taking a trip to the fuse panel. Given the above, I suspect that fuse is being lenient with us, since her system (identical to mine) probably draws another 3.0 amps.
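For those checking the arithmetic, the kWh and amp figures follow from simple formulas; a quick Python sketch (the 1500 W space-heater rating is my assumption of a typical unit):

```python
def kwh_per_day(watts):
    """Continuous draw in watts -> kilowatt-hours per day."""
    return watts * 24 / 1000.0

def amps(watts, volts=120.0):
    """Rough current at a given line voltage (ignores power factor)."""
    return watts / volts

print(kwh_per_day(125))         # 3.0 -- close to the measured 2.95 kWh/day
# two ~3 A computer systems plus an assumed 1500 W space heater:
print(3.06 + 3.0 + amps(1500))  # ~18.6 A, well past a 15 A fuse
```

The small gap between computed 3.0 and measured 2.95 kWh/day is just the draw varying over the day; the fuse math shows why the heater trips it.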

Keywords: computer, power usage, watts, how much power does my computer consume

HOWTO set up NUT on Gentoo Linux for Tripp Lite OMNI1000LCD USB UPS

The below documents my configuration of a Tripp Lite OMNI1000LCD USB UPS to communicate with Gentoo Linux (2.6.17-gentoo-r8 kernel) on my home network using NUT (Network UPS Tools). Note this is much easier as of Dec 2007 - see the new howto here



Plugged into the UPS :
(1) Linux Server (Older machine with AMD XP processor)
(2) D-Link DI-724GU Router
(3) Comcast Cable modem (Motorola SB5120)
(4) Vonage Phone Adapter (Motorola VT1005V)
(5) Oregon Scientific WMR-968 Weather Station

The average draw of those components is around 125 watts (you have a Kill-A-Watt device, don't you? Kill-A-Watt link). During my two simulated power outages, it appeared that this UPS would power those components for ~30 minutes.

This UPS was purchased at Costco, so with that ease of distribution I suspect there are more of you out there who may want to do this; I'll write down my steps for configuring it on Gentoo Linux. Truth be told, maybe the key feature of this UPS versus others is the pretty blue LCD display panel; even my wife attests to its aesthetic value.

Configuration took the better part of a day because I didn't realize/admit that the 2.0.4 version of NUT didn't support my Tripp Lite USB UPS. Once I used the development tree of NUT it was much easier. I hope to save you that time.

1. Ensure your kernel has hid support compiled into it

- in 'make menuconfig' select the '/dev/hiddev raw HID device support'

Device Drivers --->
USB support --->
<*> Support for Host-side USB
[*] HID input layer support
[ ] Force feedback support (EXPERIMENTAL)
[*] /dev/hiddev raw HID device support

2. emerge 'sys-power/nut'

Guess what? As of January 1, 2007, this does not include support for the Tripp Lite USB models (nut 2.0.4-r1 in Portage). You will have to fetch the development trunk from the fine people on the NUT development team and create a Portage overlay. From the trunk, it runs great. I spent significant time starting the newhidups driver and getting the response No matching USB/HID UPS found, a message that appears in about 20 Google results. It just doesn't work ;)

Note: Everything I know about Portage overlays I learned while watching a 'Cold Case' episode, so it isn't too hard. Kudos to portage.

3. Get the development trunk from Subversion. Don't have Subversion in Gentoo? Me neither, but a simple 'emerge subversion' got me a running version.

mkdir ~/src/nut
cd ~/src/nut
svn co svn://svn.debian.org/nut/trunk

This will get the latest source for nut into a directory called trunk under ~/src/nut

4. Create a Portage Overlay for the development trunk source.

We want to do this so that Portage knows what we are up to. Admittedly, you could go straight for compiling the nut source and figure out all the configuration necessary to get it running on Gentoo, but that was already figured out for sys-power/nut-2.0.4-r1, so let's use that.

References:
HOWTO Create an Updated Ebuild
HOWTO Install 3rd Party Ebuilds (slightly less relevant in our case.)

a. Package the source:
in dir ~/src/nut
cp -r trunk nut-2.1.0
note: we are making up a nut version that follows the portage convention.
tar -cf nut-2.1.0.tar nut-2.1.0
gzip nut-2.1.0.tar

b. copy the source into the portage distribution tree
cp nut-2.1.0.tar.gz /usr/portage/distfiles/

c. Create an overlay directory and add it to your make.conf (I am presuming you have root-like powers, aren't I? Keep this in mind)
mkdir -p /usr/local/portage && echo 'PORTDIR_OVERLAY="/usr/local/portage"' >> /etc/make.conf

d. copy the existing ebuild for nut into your new Portage Overlay directory

make the dir: mkdir -p /usr/local/portage/sys-power
copy it: cp -r /usr/portage/sys-power/nut /usr/local/portage/sys-power/

e. create the new ebuild file (copying from the latest ebuild).
cd /usr/local/portage/sys-power/nut
cp nut-2.0.4-r1.ebuild nut-2.1.0.ebuild
note: remember to name this the same as tar file above.

f. remove the ebuilds you don't need (I like clean)
for me it was: rm nut-2.0*

g. edit the ebuild. I removed a patch that probably isn't necessary in the development trunk.
nano -w /usr/local/portage/sys-power/nut/nut-2.1.0.ebuild

I removed line 46 that started with 'epatch'

5. Manually step through the emerge steps with ebuild (slow and cautious). Ebuild is a lower-level tool that emerge uses.
Note: I am only going to compile the driver I need; it isn't a default driver, so I need to pass the NUT_DRIVERS directive to the ebuild system. Additionally, when I didn't do this, one of the drivers in the development tree had a compile-time error.

a. The digest:
NUT_DRIVERS="newhidups" ebuild /usr/local/portage/sys-power/nut/nut-2.1.0.ebuild digest

b. The unpacking (this unzips the tar file you created earlier):
NUT_DRIVERS="newhidups" ebuild /usr/local/portage/sys-power/nut/nut-2.1.0.ebuild unpack

c. The compile
NUT_DRIVERS="newhidups" ebuild /usr/local/portage/sys-power/nut/nut-2.1.0.ebuild compile

d. The installation
NUT_DRIVERS="newhidups" ebuild /usr/local/portage/sys-power/nut/nut-2.1.0.ebuild install

Without the NUT_DRIVERS declaration I got:
/bin/sh ../libtool --tag=CC --mode=link i686-pc-linux-gnu-gcc -I../include -O2 -march=athlon-xp -pipe -Wall -Wsign-compare -o blazer blazer.o ../common/libcommon.a ../common/upsconf.o
../common/parseconf.o
../common/state.o main.o dstate.o serial.o
i686-pc-linux-gnu-gcc -I../include -O2 -march=athlon-xp -pipe -Wall -Wsign-compare -o blazer blazer.o ../common/upsconf.o ../common/parseconf.o
../common/state.o main.o dstate.o serial.o ../common/libcommon.a
if i686-pc-linux-gnu-gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -O2 -march=athlon-xp -pipe -Wall -Wsign-compare -MT cpsups.o -MD -MP -MF ".deps/cpsups.Tpo" -c -o cpsups.o cpsups.c; \
then mv -f ".deps/cpsups.Tpo" ".deps/cpsups.Po"; else rm -f ".deps/cpsups.Tpo"; exit 1; fi
cpsups.c: In function 'clr_cps_serial':
cpsups.c:110: error: 'TIOCM_DTR' undeclared (first use in this function)
cpsups.c:110: error: (Each undeclared identifier is reported only once
cpsups.c:110: error: for each function it appears in.)
cpsups.c:112: warning: implicit declaration of function 'ioctl'
cpsups.c:112: error: 'TIOCMBIC' undeclared (first use in this function)
cpsups.c: In function 'set_cps_serial':
cpsups.c:117: error: 'TIOCM_DTR' undeclared (first use in this function)
cpsups.c:119: error: 'TIOCMBIS' undeclared (first use in this function)

6. Configure NUT

Now that we've made this build, you can follow the steps at the wiki:
Gentoo HOWTO NUT (Network UPS Tools)

You can pick up after the Install Software section. Yes, you'll need the newhidups driver. I am pasting in the article below, with a couple of mods for my UPS, for posterity/consistency.

I added the bit about adding a MONITOR line to upsmon.conf. This is necessary for upsmon to work.

Configuring

Go to /etc/nut. Open ups.conf. Add to this file every UPS you want to monitor. Mine looks like this:

# [powerware]
# driver = bcmxcp
# port = /dev/ttyS0
# desc = "Server, adsl, 3com"


Change the values to something that fits your configuration better and save. If you have a USB connection to your UPS, your entry might look like:

[OMNI1000]
driver = newhidups
port = auto
desc = "Tripp Lite OMNI1000LCD USB"

Open upsd.conf. This file contains the access policy for the UPSes you have.

To only allow the same computer to connect to them, the file should look like this:

ACL all 0.0.0.0/0
ACL localhost 127.0.0.1

ACCEPT localhost
REJECT all


The ACL lines are used to add hosts. The syntax is ACL name IP. If the name is placed after ACCEPT, connections from there are accepted; if it's placed after REJECT, connections are rejected. This sort of reminds me of a hosts.allow file.
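To illustrate the first-match idea, here is a small Python sketch of the ACCEPT/REJECT logic (it mirrors the upsd.conf above but is only an illustration, not NUT's actual implementation):

```python
import ipaddress

# named networks, as in the ACL lines above
acls = {
    "all": ipaddress.ip_network("0.0.0.0/0"),
    "localhost": ipaddress.ip_network("127.0.0.1/32"),
}
# rules are evaluated in order; the first matching one wins
rules = [("ACCEPT", "localhost"), ("REJECT", "all")]

def allowed(client_ip: str) -> bool:
    ip = ipaddress.ip_address(client_ip)
    for action, name in rules:
        if ip in acls[name]:
            return action == "ACCEPT"
    return False

print(allowed("127.0.0.1"))  # True  -- matches localhost, ACCEPT
print(allowed("10.0.0.5"))   # False -- falls through to all, REJECT
```

A LAN host would only be accepted if you added its network as an ACL name and an ACCEPT line for it above the catch-all REJECT.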

Next open upsd.users. This file contains accounts for users who can make modifications. The following entry grants the user server access to everything. From my observation this isn't integrated with particular system logins, so you can make up a new user/pass and run with that. The only place you need to remember it is if you are setting properties on the UPS through NUT.


[server]
password = changeme
allowfrom = localhost
actions = SET
instcmds = ALL
upsmon master


Next take a look at upsmon.conf. This is the UPS monitor configuration. A scan through this file will answer more questions than my writing about it. It is pretty simple; the three changes below are all I needed to make.

  1. in Gentoo we compiled the nut package using the 'nut' user, so ensure this is in there.
    RUN_AS_USER nut
  2. The UPS to monitor
    MONITOR OMNI1000@localhost 1 server changeme master
  3. for those of us with a measly single UPS for our home network you'll need this line:
    MINSUPPLIES 1


Next, if you want, look at upssched.conf too; it may be interesting if you want to schedule events. This will be useful if you want to automate something like "30 seconds after the power is out, send an email via SMTP." I am currently not interested in this.

Finishing

Now, start the upsd and upsmon service:

/etc/init.d/upsd start
/etc/init.d/upsmon start

This should identify if there is something still wrong.
If nothing is wrong, add them to the default runlevel:

rc-update add upsd default
rc-update add upsmon default

If you didn't get any errors when you started the services, you seem to have configured it right. Otherwise, have a look at the error output and think about what might be wrong. One error I got was with the permissions of the serial port, which is easily fixed by adding nut to the tty group.

Testing

You'll be using the tools upsc and upscmd.

Test upsc and see if your UPS replies:

upsc yourupsname@yourupshost ups.status
specifically: upsc OMNI1000@localhost ups.status

If you get "OL" (On Line), everything is working well (it is on line power and not battery).

To see every command your UPS supports, type

upscmd -l OMNI1000@localhost

A full status looks like the below.

#upsc OMNI1000@localhost
battery.charge: 100
battery.type: PbAc
battery.voltage: 13.4
battery.voltage.nominal: 12.0
driver.name: newhidups
driver.parameter.port: auto
driver.version: 2.1.0
driver.version.data: TrippLite HID 0.1 (experimental)
driver.version.internal: 0.30
input.frequency: 59.8
input.voltage: 117.5
input.voltage.nominal: 120
output.frequency.nominal: 60
output.voltage.nominal: 120
ups.beeper.status: enabled
ups.delay.reboot: 65535
ups.delay.shutdown: 65535
ups.mfr: Tripp Lite
ups.model: TRIPP LITE UPS
ups.power.nominal: 1000
ups.serial: 692195 B
ups.status: OL CHRG
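Since upsc prints flat name: value pairs, its output is easy to consume from a script. A small Python sketch of parsing it (the sample text is abridged from the output above):

```python
sample = """\
battery.charge: 100
input.voltage: 117.5
ups.status: OL CHRG
"""

def parse_upsc(text):
    """Turn upsc's 'name: value' lines into a dict of strings."""
    status = {}
    for line in text.splitlines():
        if ": " in line:
            key, _, value = line.partition(": ")
            status[key] = value
    return status

info = parse_upsc(sample)
print(info["ups.status"])  # OL CHRG
```

In a real script you would feed this the output of `upsc OMNI1000@localhost` and, say, alert when "OB" (on battery) appears in ups.status.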



7. Customizing and understanding upsmon.

It is upsmon that is responsible for a system shutdown when the battery reaches a critical level. This is also where you'll want to look if you want to do magic like sending emails, etc.


There you have it: in 7 easy (or not-so-easy?) steps, it should be configured.

UPS/NUT References that will help.
Network UPS Tools Home
Nut-upsuser Mailing List. Note: from my lurking on this mailing list, it is filled with very helpful people who have an abundance of experience with NUT.

Keywords: Tripp Lite, USB UPS, Tripp Lite Gentoo, Tripp Lite OMNI1000LCD, USB HID UPS Linux, Gentoo HID UPS, gentoo NUT portage overlay, Gentoo UPS NUT, Tripp Lite Linux, Tripp Lite OMNI1000LCD Gentoo, Tripp Lite OMNI1000LCD Linux, Tripp Lite OMNI1000LCD UPS.

Monday, December 11, 2006

HOWTO remove blocked packages in portage on gentoo

On my routine weekly updates of my Gentoo installation I often encounter blocked packages in response to the command 'emerge --update --deep --ask world':

[blocks B ] mail-mta/ssmtp (is blocking mail-mta/postfix-2.2.2-r1)

Until I became familiar with it, this seemed confusing. My analysis: the order of this response is what tripped me up, combined with the temptation of speed reading. My interpretation was 'blocks mail-mta/ssmtp. Is blocking mail-mta/postfix-2.2.2-r1?' The parentheses caused me to read it as a separate thought. It is not, and it is my fault for confusing myself; as an aside, I'd love for the parentheses to be removed. That interpretive-linguistics analysis aside, on to the point.

Look at the statement: mail-mta/ssmtp is blocking mail-mta/postfix. The simple thing to do is say, I don't need ssmtp, let me remove it. The Portage command:

emerge --unmerge 'mail-mta/ssmtp'

In summary, it is the first item listed that is the 'blocking' package; that is what you want to remove.

Then try again with 'emerge --update --deep --ask world'

Of course the removal of something needed is a bad idea, so be careful.

An introduction to Portage: link
A simple portage Wiki page: Portage and Ebuilds
The man page: Portage Man page
The "best-known practices" for working with Portage: HOWTO_Use_Portage_Correctly
keywords: gentoo remove blocked package, portage remove blocked package, gentoo blocks, portage blocks.

Wednesday, November 29, 2006

Patterns and Practices of Software Development

It appears to be low profile, so I hadn't heard of this until a co-worker pointed out this MS wiki, where they are putting together a community to document patterns and practices.

Microsoft's Patterns and Practices Guidance Library


This is useful in two ways. (1) It defines templates for creating better documentation; for instance, there is a template for writing a Mini How To, which lists a number of good ideas to consider while writing one. (2) It provides documentation itself, for example How To: Use Regular Expressions to Constrain Input in ASP.NET. It also has checklists, for instance a Web Services Security Checklist.

From my observation, there are not necessarily 'deep' thoughts here. The statements all make sense, to the point that it seems like stating the obvious. The value is that the thoughts have been persisted, collected, and presented in a clean and concise way. This means that in our haste to get a product out the door, we can walk through a relatively simple list to verify we didn't miss something obvious.

A good idea. I hope it continues to have contributions and a community is built around this.

Tuesday, November 28, 2006

HOWTO Forge (linux)

A friend of mine sent me this link. This is really my blog on steroids, and then some: a community of people making howtos for Linux. Smart. I wish I had thought of it :-).

http://www.howtoforge.org/ - HOWTO Forge

I am definitely adding this to my resources. Maybe I should start to contribute.

Monday, November 27, 2006

HOWTO use microsoft's logparser to analyze IIS logs with example sql/code

Logparser can be your good friend if you have a large set of data (text form or otherwise) and you would like to summarize it. It can be used to analyze Microsoft’s Internet Information Server (IIS) logfiles, text based logfiles, XML files, Eventviewer data, Registry, Active Directory Objects, CSVs and more (see all the input formats at the end of blog entry).

The below documents how to use logparser, with a number of examples. Most of the IIS log-parsing examples were not developed by me; rather, there is an MS team that can be employed to do an IIS health check, and these are the logparser SQLs they used.

Logparser to start

I recommend becoming familiar with:
logparser -h
In all truth, all my needs have been answered by the command-line help. I may have gone googling for a solution, but the problem was always solvable with careful reading of the help.

Logparser and IIS logs.

Logparser automatically reads the IIS header. In fact, I highly suspect that the reason for the tool's existence began with the need to analyze IIS logs - the history and lore I have not taken the time to learn. I'll let you correct me?

Queries (examples):
updated March 2007 to add reverse DNS lookup, Referer URLs (sic), and Referer Summary (sic).
• Merge Multiple Log files
To consolidate log files into a single file.
logparser -o:IIS "select * into merged.log from ex*.log"
• A count of the Total Requests
logparser "select count(*) into IISLOG_TOTAL_REQ.csv from ex061023.log"
• How many unique clients
logparser "select count(distinct c-ip) into IISLOG_DISTINCT_CLIENTS.csv from ex061023.log"
• Top 20 URLs Hit
logparser "SELECT TOP 20 cs-uri-stem, COUNT(*) AS Hits INTO Analysis.csv from ex061023.log group by cs-uri-stem order by Hits DESC"
• Top 20 ASP pages Hit
logparser "SELECT TOP 20 cs-uri-stem, COUNT(*) AS Hits INTO Analysis.csv from ex061023.log where cs-uri-stem like '%%.asp' group by cs-uri-stem order by Hits DESC"
• Hit Frequency (how many hits per hour)
logparser "SELECT TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)), COUNT(*) AS Hit_Frequency INTO IISLOG_ANALYSIS_HIT_FREQ.CSV FROM ex061023.log GROUP BY TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)) ORDER BY TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)) ASC"
• Bytes per Extension
What is the percentage of the bytes served per extension-type?
logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS Extension, MUL(PROPSUM(sc-bytes),100.0) AS PercentOfTotalBytes INTO IISLOG_ANALYSIS_BYTES_PER_EXT.CSV FROM ex061023.log GROUP BY Extension ORDER BY PercentOfTotalBytes DESC"
• Top 20 Clients Hitting this server
logparser "SELECT top 20 c-ip AS Client_IP,count(c-ip) AS PageCount from ex061023.log to IISLOG_ANALYSIS_TOP20_CLIENT_IP.CSV GROUP BY c-ip ORDER BY count(c-ip) DESC"
• REVERSEDNS of Top 20 Clients Hitting this server (reversedns(...) is a long running function for obvious reasons)
logparser "SELECT top 20 c-ip AS Client_IP, REVERSEDNS(c-ip),count(c-ip) AS PageCount from ex061023.log to IISLOG_ANALYSIS_TOP20_CLIENT_IP_WITH_DNS.CSV GROUP BY c-ip ORDER BY count(c-ip) DESC"
• Referrer Host Names directing traffic to this server with count of pages referred (summary)
logparser "SELECT ReferringHost, count(*) AS TotalReferrals, Min(cs(Referer)) AS ExampleRefererURL USING CASE EXTRACT_TOKEN(cs(Referer),2, '/') WHEN null THEN 'NoReferer' ELSE EXTRACT_TOKEN(cs(Referer),2, '/') END as ReferringHost into IISLOG_ANALYSIS_REFERER_HOSTS.CSV FROM ex061023.log group by ReferringHost order by count(*) DESC"
• Referrer URLs directing traffic to this server (full report)
logparser "SELECT EXTRACT_TOKEN(cs(Referer),2, '/') as RefererHostName, cs(Referer) AS RefererURL, count(cs(Referer)) AS TotalReferrals into IISLOG_ANALYSIS_REFERERURLs.CSV FROM ex061023.log group by cs(Referer) order by count(cs(Referer)) DESC"
• Unique Clients per Hour
This is two separate SQLs.
1. logparser -o:CSV "Select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)) as Times, c-ip as ClientIP into IISLOG_ANALYSIS_DIST_CLIENT_IP.LOG from ex061023.log group by Times, ClientIP"
2. logparser -i:CSV "Select Times, count(*) as Count from IISLOG_ANALYSIS_DIST_CLIENT_IP.LOG to IISLOG_ANALYSIS_HOURLY_UNIQUE_CIP.CSV group by Times order by Times ASC"
• IIS Errors and URL Stem (Error code > 400)
logparser "SELECT cs-uri-stem, sc-status,sc-win32-status,COUNT(cs-uri-stem) from ex061023.log to IISLOG_ANALYSIS_ERROR_COUNT.CSV where sc-status>=400 GROUP BY cs-uri-stem,sc-status,sc-win32-status ORDER BY COUNT(cs-uri-stem) DESC"
• IIS Errors by hour (Error code > 500)
Can answer if the errors are load related
logparser "SELECT TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)), COUNT(*) AS Error_Frequency FROM ex061023.log TO IISLOG_ANALYSIS_ERROR_FREQ.CSV WHERE sc-status >= 500 GROUP BY TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)) ORDER BY TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)) ASC"
• Status Code distribution
logparser "SELECT sc-status, COUNT(*) AS Times from ex061023.log to IISLOG_ANALYSIS_STATUS_CODE.CSV GROUP BY sc-status ORDER BY Times DESC"
• Top 20 Longest time-taken (on average) pages
logparser "SELECT top 20 cs-uri-stem,count(cs-uri-stem) As Count,avg(sc-bytes) as sc-bytes,max(time-taken) as Max,min(time-taken) as Min,avg(time-taken) as Avg from ex061023.log to IISLOG_ANALYSIS_TOP20_AVG_LONGEST.CSV GROUP BY cs-uri-stem ORDER BY avg(time-taken) DESC"
• Top 50 longest requests
logparser "SELECT top 50 TO_LOWERCASE(cs-uri-stem),time,sc-bytes,time-taken INTO IISLOG_ANALYSIS_TOP50_LONGEST.CSV FROM ex061023.log ORDER BY time-taken DESC"
• Average Response time by Hour
logparser "SELECT TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)), avg(time-taken) INTO IISLOG_ANALYSIS_AVG_RESP_TIME.CSV FROM ex061023.log WHERE cs-uri-stem like '%%.asp' GROUP BY TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)) ORDER BY TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time),3600)) ASC"
• Percentage Processing time by extension
logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS Extension, MUL(PROPSUM(time-taken),100.0) AS Processing_Time INTO IISLOG_ANALYSIS_PROCTIME_PER_EXT.CSV FROM ex061023.log GROUP BY Extension ORDER BY Processing_Time DESC"
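Several of the queries above lean on QUANTIZE(TO_TIMESTAMP(date, time), 3600) to bucket each request into its hour. A sketch of what that bucketing does (Python used purely for illustration; this is not logparser code):

```python
from datetime import datetime

def quantize_to_hour(ts: datetime) -> datetime:
    """Mimic logparser's QUANTIZE(ts, 3600): truncate to the hour."""
    return ts.replace(minute=0, second=0, microsecond=0)

# every hit between 18:00:00 and 18:59:59 lands in the same bucket,
# so GROUP BY on the quantized value yields per-hour counts
hit = datetime(2006, 10, 23, 18, 42, 7)
print(quantize_to_hour(hit))  # 2006-10-23 18:00:00
```

GROUP BY over this truncated timestamp is what turns a raw request log into the hourly hit-frequency and error-frequency reports above.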

As an added bonus, I've created a small cmd (Windows) shell script that runs through all but the first of these queries against a log file. It is located at the following link:
download it
Note it requires logparser on the path and is invoked as:
logparseranalysis.cmd ex061023.log

Logparser and creating separate SQL files (the file: argument)

You may have noticed that these SQLs can get long, as is the way with SQL. Logparser provides the means to put these long SQLs in a text file, and the ability to pass arguments is of course a given. An example is in order: to use the command line below, you will need to create a small text file (extension .sql) with the contents shown.

Command Line:
logparser file:iis.sql?logfile=ex061113.log

Text file: iis.sql
-- Start of SQL file --
SELECT
c-ip AS ClientIP,
cs-host AS HostName,
cs-uri-stem AS URIStem,
sc-status AS Status,
cs(User-Agent) AS UserAgent,
count (*) as Requests
INTO output.csv
FROM %logfile%
where time > to_timestamp('18:20:00', 'hh:mm:ss') and time < to_timestamp('18:45:00', 'hh:mm:ss') GROUP BY c-ip, cs-uri-stem, cs-host, cs(User-Agent), sc-status ORDER BY Requests DESC
-- End of SQL file --


Logparser and files without headers

Don't have a header in your CSV file? With a little work we can define a logparser SQL that maps the anonymous fields to meaningful names. The automatic header-row parsing will need to be turned off.
Command Line:
logparser -i:csv -headerRow:OFF file:dslog.sql?logfile=logwoutheader.log+outputfile=out.csv
Text file: dslog.sql
-- Start of SQL file --
select To_TimeStamp(MyDate, MyTime) as DateTime, field3 as MachineNane, field4 as PID, field5 as TID, To_Int(field6) as ErrorLevel, field7 as RegExp, field8 as Line, field9 as SID, field12 as Message using TO_TIMESTAMP(field1,'MM/dd/yyyy') as MyDate, TO_TIMESTAMP(field2, 'hh:mm:ss.lx') as MyTime into %OUTPUTFILE% from %LOGFILE% where ErrorLevel >= 35
-- End of SQL file --
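The same trick of naming anonymous columns yourself works in any CSV reader. A Python sketch of the idea (the field names and sample line here are invented for illustration):

```python
import csv
import io

# a headerless log line: date, time, machine -- the first three of the
# positional fields the SQL above maps by hand
raw = "11/27/2006,18:20:05.123,SERVER01\n"

# supplying fieldnames replaces the missing header row
reader = csv.DictReader(io.StringIO(raw),
                        fieldnames=["MyDate", "MyTime", "MachineName"])
row = next(reader)
print(row["MachineName"])  # SERVER01
```

Logparser's -headerRow:OFF plus the field1/field2/... aliases in the SQL accomplish the same thing: positional columns get meaningful names.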


Logparser and the Event Viewer
Although already covered in a previous article, logparser can also connect to the Event Viewer and analyze those logs. It can even do this on remote machines. The SQL below is an example of how to detect locked-out accounts.

Command Line:
logparser file:lockedoutaccounts.sql?DOMAINCONTROLER=HQDC01C

Text file: lockedoutaccounts.sql
-- Start of SQL file --
SELECT timegenerated AS TimeLockedout,
extract_token(strings, 0, '|') As UserName ,
extract_token(strings, 1, '|') AS OriginatingMachine,
EventID,
SourceName,
Message,
CASE EventID
WHEN 529 THEN 'Invalid userid/password'
WHEN 531 Then 'Account disabled'
WHEN 539 Then 'Account locked out'
WHEN 530 Then 'Outside of logon time'
WHEN 532 THEN 'Account Expired'
WHEN 535 THEN 'Password Expired'
WHEN 533 THEN 'User not from allowed system'
WHEN 644 THEN 'Account Auto Locked'
WHEN 540 THEN 'Successful logon'
ELSE 'Not specified' END AS EventDesc,
strings
INTO lockedact.csv
FROM \\%DOMAINCONTROLER%\Security
WHERE EventID=644
-- End of SQL file --


Reference Material:
http://www.logparser.com/ - the unofficial logparser site. It hosts a great knowledge base and an active forum.

How logparser 2.2 Works

To download: Download Logparser

Logparser Blog Entry

Logparser 2.2 Input formats:
• IISW3C: This is the IIS W3C Extended log file format.
• IIS: This is the IIS log file format.
• IISMSID: This is the log format for files generated by IIS when the MSIDFILT filter or the CLOGFILT filter is installed.
• NCSA: This is the IIS NCSA Common log file format.
• ODBC: This is the IIS ODBC format, which sends log files to an ODBC-compliant database.
• BIN: This is the IIS binary log file format.
• URLSCAN: This is the format for URLScan logs.
• HTTPERR: This is the IIS 6.0 HTTP error log file format.
• EVT: This is the Microsoft Windows Event Messages format.
• TEXTWORD: This is a generic text file, where the TEXT value is any separate word.
• TEXTLINE: This is a generic text file, where the TEXT value is any separate line.
• CSV: This is a comma-separated list of values.
• W3C: This is a generic W3C log file, such as a log generated by Windows Media Services or Personal Firewall.
• FS: This provides information about file and directory properties.
• XML: Reads XML files (requires the Microsoft® XML Parser (MSXML))
• TSV: Reads tab- and space-separated values text files
• ADS: Reads information from Active Directory objects
• REG: Reads information from the Windows Registry
• NETMON: Makes it possible to parse NetMon .cap capture files
• ETW: Reads Event Tracing for Windows log files and live sessions
Logparser 2.2 Output formats:
• W3C: This format sends results to a text file that contains headers and values that are separated by spaces.
• IIS: This format sends results to a text file with values separated by commas and spaces.
• SQL: This format sends results to a SQL table.
• CSV: This format sends results to a text file. Values are separated by commas and optional tab spaces.
• XML: This format sends results to an XML-formatted text file.
• Template: This format sends results to a text file formatted according to a user-specified template.
• Native: This format is intended for viewing results on screen.
• CHART: Creates chart image files (requires Microsoft Office 2000 or later)
• TSV: Writes tab- and space-separated values text files
• SYSLOG: Sends information to a SYSLOG server or to a SYSLOG-formatted text file


Keywords: IIS log file analysis, IIS 6.0, IIS 5.0, IIS, logparser, logparser examples, logparser samples, logparser input formats, logparser output formats, logparser examples, howto use logparser, example sqls for logparser, how to use logparser, Analyzing IIS logs with logparser, logparser and files without headers, logparser eventviewer example, using logparser to analyze IIS logfiles, logparser sample code.

Sunday, November 26, 2006

How to add a DiggIt and Del.icio.us links/button to blogger

Do you want to make it easy for readers to submit your blog posts to social networking sites like Digg and Del.icio.us?

You can add the submission URLs to the Comment footer of each of your blogger posts.

Below is how to do this in both the new and old version of Blogger.

Heads Up Folks: I've recently posted a simpler way to do this; see this new post here.

NEWEST BLOGGER

Thanks to Sabre for looking and finding this. As he writes in his blog SabreNews: HOWTO add "diggit" and "del.icio.us" links to blogger( not beta , but latest version)

1. Check whether the Email Post page element is added to your blog.
If not, you can add it through Template ---> Page Elements in your blog admin page.

2. After adding email-post, search for it in the Edit HTML page of the Template.
Do not forget to check "Expand Widget Templates".

3. Paste the following snippet before the tag:
::for DIGGIT
<a expr:href='"http://digg.com/submit?phase=2&url=" +
data:post.url + "&title=" + data:post.title'
target='_blank'>DiggIt!</a>
::for DEL.ICIO.US
<a expr:href='"http://del.icio.us/post?url=" +
data:post.url + "&title=" + data:post.title'
target='_blank'>Del.icio.us</a>



OLD BLOGGER


(1) Go to your Blogger template.
(2) Search for the BlogItemCommentsEnabled section.
(3) Add the submission URLs following these templates:
digg: http://digg.com/submit?phase=2&url=www.UniqueURL.com&
title=StoryTitle&bodytext=StoryDescription&topic=YourSelectedTopic
del.icio.us:
http://del.icio.us/post?url=www.UniqueURL.com&title=StoryTitle
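Either way, the links are just URLs with query parameters, so you can also build them programmatically. A minimal Python sketch (the parameter names come from the templates above; the example post URL and title are made up):

```python
from urllib.parse import urlencode

def digg_url(post_url, title):
    # Follows the digg template above: phase, url and title parameters.
    return "http://digg.com/submit?" + urlencode(
        {"phase": "2", "url": post_url, "title": title})

def delicious_url(post_url, title):
    # Follows the del.icio.us template above: url and title parameters.
    return "http://del.icio.us/post?" + urlencode(
        {"url": post_url, "title": title})

print(digg_url("http://example.blogspot.com/2006/11/my-post.html", "My Post"))
```

urlencode takes care of escaping the URL and any spaces in the title, which the raw templates leave up to Blogger.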

Below is the BlogItemCommentsEnabled section I am using.

<BlogItemCommentsEnabled><br><a href="http://digg.com/submit?phase=3&url=<$BlogItemPermalinkUrl$>&title=<$BlogItemTitle$>" Title="Submit To Digg" target="_blank">DiggIt!</a> | <a href="http://del.icio.us/post?url=<$BlogItemPermalinkUrl$>&title=<$BlogItemTitle$>" Title="Del.icio.us" target="_blank">Del.icio.us</a> | <a href="<$BlogItemCommentCreate$>"
<$BlogItemCommentFormOnclick$>>
<$BlogItemCommentCount$> comments</a>



The blog I found this information in is:
Technology Wrap: Guide: How to add a DiggIt and Del.icio.us button to blogger

Keywords: Blogger, Submit URL links, Digg link, Diggit link, Del.icio.us link, Digg submit link, Diggit submit link, Del.icio.us submit link, HOWTO add Digg submit links to blogger, HOWTO add delicious submit links to blogger.

Wednesday, November 15, 2006

Paste Special Unformatted Text At Your Fingertips

This one is a pet peeve of mine, though I admit that sometimes Rich Text is really a good thing. In our world of Windows Rich Text, there are times when I want to paste rich text and many more times when I want to paste text only. My wife actually convinced me the need is great enough to research it. I've used the Paste Special command for a long time, but have been troubled by the lack of a shortcut key for it.

This article (link below) is a great how-to on creating a macro bound to a shortcut key in MS Word, which these days (Office 2003) translates directly to the Outlook 2003 email client. If you create this macro in Word, you can use it when composing your Outlook emails.

Paste Special Unformatted Text At Your Fingertips

The next extension would be to create a Windows wide Paste Special shortcut key. Sure, I'll add it to my project list ;)

Update: I won't bother with the Windows-wide implementation after all. It is already done:
PureText - http://stevemiller.net/PureText/; a quick trial suggests it works well.

A quote from the web-page:
Have you ever copied some text from a web page or a document and then wanted to paste it as simple text into another application without getting all the formatting from the original source? PureText makes this simple by adding a new Windows hot-key (default is WINDOWS+V) that allows you to paste text to any application without formatting.

After running PureText.exe, you will see a "PT" tray icon appear near the clock on your task bar. You can click on this icon to remove formatting from the text that is currently on the clipboard. You can right-click on the icon to display a menu with more options.



Keywords: Paste Special keyboard shortcut, paste special, paste unformatted text, paste unformatted text shortcut key, unformatted text paste, unformatted paste, Windows, Word, Outlook, Windows Wide unformatted paste.

Integer Types In C and C++

Unless I am switching back and forth between compilers often, I tend to need a kick (or a look-up) to recall the different implementations of integer types.

The below page of Jack Klein's is just that kick.

Integer Types In C and C++


The introduction (a copy-paste from the above site) and a sample program to run through your compiler are below.

Introduction

You would think that the basic integer types provided by the C and C++ languages wouldn't cause as much confusion as they do. Almost every day there are posts in the C and C++ newsgroups which show that many newcomers do not understand them. Some experienced programmers who are only familiar with one platform do not understand them either.

The most common sources of confusion are the sizes of the integer types and the range of values which they can hold. That is because the languages leave many features of the integer types implementation-defined, meaning that it is up to the particular compiler to determine their exact specifications. C and C++ do set minimum requirements for each of the integer types, but the compiler is free to exceed these limits.

Each compiler is required to document its implementation. This information should be available in the printed manuals, online help, or man pages which come with the compiler.

In addition, there is a required standard header named <limits.h> (<climits> in newer C++ compilers) that provides information about the integer types that can be used in your programs at run time. A compiler is not required to provide a header like <limits.h> as a readable text file, but I do not know of any compilers which do not.

There are programs on this page to display the information that this file contains.



A Program To Display Integer Type Information Standard C++ Compilers


#include <iostream>
#include <climits>

using std::cout;
using std::endl;

volatile int char_min = CHAR_MIN;

int main(void)
{
cout << "Size of boolean type is "
<< sizeof(bool) << " byte(s)"
<< "\n\n";

cout << "Number of bits in a character: "
<< CHAR_BIT << '\n';
cout << "Size of character types is "
<< sizeof(char)
<< " byte" << '\n';
cout << "Signed char min: "
<< SCHAR_MIN << " max: "
<< SCHAR_MAX << '\n';
cout << "Unsigned char min: 0 max: "
<< UCHAR_MAX << '\n';

cout << "Default char is ";

if (char_min < 0)
cout << "signed";
else if (char_min == 0)
cout << "unsigned";
else
cout << "non-standard";
cout << "\n\n";

cout << "Size of short int types is "
<< sizeof(short) << " bytes"
<< '\n';
cout << "Signed short min: "
<< SHRT_MIN << " max: "
<< SHRT_MAX << '\n';
cout << "Unsigned short min: 0 max: "
<< USHRT_MAX << "\n\n";

cout << "Size of int types is "
<< sizeof(int) << " bytes"
<< '\n';
cout << "Signed int min: "
<< INT_MIN << " max: "
<< INT_MAX << '\n';
cout << "Unsigned int min: 0 max: "
<< UINT_MAX << "\n\n";

cout << "Size of long int types is "
<< sizeof(long) << " bytes"
<< '\n';
cout << "Signed long min: " <<
LONG_MIN << " max: "
<< LONG_MAX << '\n';
cout << "Unsigned long min: 0 max: "
<< ULONG_MAX << endl;

return 0;
}
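If you want to sanity-check the same implementation-defined sizes without a C++ compiler handy, Python's ctypes module mirrors the host C ABI and can report them too. A quick sketch:

```python
import ctypes

# ctypes follows the host C ABI, so these match what the C++ program
# above prints for sizeof on the same machine.
for name, ctype in [("char", ctypes.c_char), ("short", ctypes.c_short),
                    ("int", ctypes.c_int), ("long", ctypes.c_long)]:
    print(f"sizeof({name}) = {ctypes.sizeof(ctype)} byte(s)")
```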

Keywords: C++, types, integer, int, short, long, char, bool, sizeof, C, compiler implementation of integer types, unsigned.

Sunday, November 12, 2006

HTML Validation Service (W3C Markup)

While exploring the internet trying to figure out a trivial little problem, I rediscovered a tool that w3.org provides to any and all users on the internet. It can validate a multitude of DOCTYPEs, from XHTML 1.0 to SVG 1.1 and several flavors in between.

A quote from the validator's FAQ:
"Most pages on the World Wide Web are written in computer languages (such as HTML) that allow Web authors to structure text, add multimedia content, and specify what appearance, or style, the result should have.

As for every language, these have their own grammar, vocabulary and syntax, and every document written with these computer languages are supposed to follow these rules. The (X)HTML languages, for all versions up to XHTML 1.1, are using machine-readable grammars called DTDs, a mechanism inherited from SGML.

However, Just as texts in a natural language can include spelling or grammar errors, documents using Markup languages may (for various reasons) not be following these rules. The process of verifying whether a document actually follows the rules for the language(s) it uses is called validation, and the tool used for that is a validator. A document that passes this process with success is called valid.

With these concepts in mind, we can define "markup validation" as the process of checking a Web document against the grammar (generally a DTD) it claims to be using."

This reminds me, I should remember to run pages I publish through this tool. I expect it to be more pedantic than a web browser, but that is a good thing. I see a few things that I need to fix up on a couple of my sites right now.

The Link: The W3C Markup Validation Service

They also provide a number of other tools such as:
A Link Checker
CSS Validator
Feed Validator
P3P Validator
RDF Validator
XML Schema Validator
HTML Semantic Extractor - Checks for metadata

Keywords: HTML validation, HTML, XML validation, XHTML, XHTML 1.0, XHTML 1.1, SVG, SVG 1.1, SVG 1.0, Web page validator, validate my html, validate the html on my website.

Thursday, November 09, 2006

HOWTO convert from Flash Video-FLV to AVI for free AKA Transcoding

This is a special request HOWTO on transcoding. There is a soul out on the internet who wants to be able to convert Flash video (FLV) to AVI, MPEG (MPG), or WMV. For those of us who don't know, this is called transcoding. It is simply a matter of decoding the video to an intermediate form and encoding it to the chosen format.

Windows Instructions: (Linux instruction are down below)

Prerequisites for the method described below.
1. Firefox
2. The Video Downloader plugin for Firefox
3. Riva FLV Encoder 2. A free FLV encoder that can transcode as well.


The step-by-step HOWTO:

1. Pick a Flash Video to download. Click the Download Video Link.


2. Save the FLV video. Make sure to rename it with an .flv extension. It is weird that Video Downloader doesn't let us pick a name.


3. Start Riva FLV Encoder.
- Pick your FLV file for the input.
- Pick the location and name of the output file.
- Pick the extension you'd like to transcode into (AVI or MPG). In my example I chose AVI.
- Press the encode button.


And there you have it, tada, you are done. When Riva FLV Encoder 2 completes, you've transcoded into another format.

Be aware that you might need the specific codec for the AVI/MPG encoding you just did.

Reading briefly on the Riva forum, it appears that there are occasional glitches and sound problems with transcoding. It isn't a specifically supported operation, either.



Riva Links:
Riva Homepage
Riva Forums

Do you want to do this for Linux?
Take a look at this: Converting flv to mpeg in Linux
Of course you can download an FLV file with Firefox in Linux.

You will need ffmpeg.
And the simple commandline:
ffmpeg -i videotoconvert.flv -ab 56 -ar 22050 -b 500 -s 320x240 output.mpg
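If you have a folder full of downloaded FLVs, the same command line can be driven from a short Python script. This is just a sketch: it assumes ffmpeg is installed and on the PATH, and reuses the flags from the one-liner above.

```python
import pathlib
import subprocess

def ffmpeg_cmd(src, dst):
    # Same flags as the command line above: 56 kbit/s audio at 22050 Hz,
    # 500 kbit/s video, 320x240 output.
    return ["ffmpeg", "-i", str(src), "-ab", "56", "-ar", "22050",
            "-b", "500", "-s", "320x240", str(dst)]

def transcode_folder(folder):
    # Convert every .flv in the folder to an .mpg next to it.
    for flv in pathlib.Path(folder).glob("*.flv"):
        subprocess.run(ffmpeg_cmd(flv, flv.with_suffix(".mpg")), check=True)
```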


Keywords: Transcoding, FLV2AVI, FLV2MPG, FLV2MPEG, Convert Flash Video to AVI or MPEG/MPG, Convert FLV to AVI, Convert FLV to MPG, Howto Convert FLV to WMV, convert youtube videos, convert google videos.

Monday, November 06, 2006

InterfaceLIFT: High-Resolution Wallpaper

Not that this is very technical, but yesterday I found a good site for sharing and getting wallpaper/desktop backgrounds.

InterfaceLIFT's content is entirely vistor-submitted and is intended to be shared. They do a great job of search and provided a multitude of resolutions from 1024x768 to 2560X1600 as well as a number of other formats (ipod/sony psp).

My search for "Seattle" returned several of great photos of the city. Another search for "Vancouver" returned even more.

InterfaceLIFT: High-Resolution Wallpaper

The more sharing, the merrier and who can complain at the price of nothing?

And if you are looking for other art they have icons too. Have a new app that you want a cool icon for? I'd check these ones out.

Paul

Keywords: Wallpaper, Desktop background, images, free wallpaper, linux, windows, icons

Friday, November 03, 2006

HOWTO create and mount ISO images in Linux

As a comparison to my last post, all that work of installing applications is moot in Linux.

To create an image (link to a more verbose explanation) use dd on an unmounted CD/DVD drive:
dd if=/dev/cd of=cd.iso

To mount an image, if your kernel is compiled with the loopback block device and ISO 9660 built in (link to a more verbose explanation):

mount -o loop -t iso9660 cd.iso /mnt/isoimage/
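Before mounting, it can be handy to confirm a file really is an ISO 9660 image. The format is easy to sniff: volume descriptors start at byte offset 32768 (sector 16 of 2048-byte sectors) and carry the standard identifier CD001. A small Python check:

```python
def is_iso9660(path):
    # ISO 9660 volume descriptors begin at offset 32768; byte 0 of a
    # descriptor is a type code and bytes 1-5 hold the magic 'CD001'.
    with open(path, "rb") as f:
        f.seek(32768 + 1)
        return f.read(5) == b"CD001"
```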

Wikipedia on ISO

Keywords: ISO, mounting, ISO mounting, CD emulator, DVD emulator, Daemon Tools, Mount ISO, Linux, Linux CD emulator, Linux DVD emulator, Gentoo Linux.

Mount an ISO (CD/DVD) image in Windows CD/DVD emulator

CDs/DVDs, who needs them?

It is better to just mount the ISO image you need and live in the virtual world, when that works for you. It is harder to do an OS installation that way, but for many other cases this is fine.

For Windows the best way to do that is to use Daemon Tools CD/DVD emulator.

the Daemon Tools download link

However, you may use the MS tool (not as slick as Daemon Tools, and unsupported); it can be downloaded at:

Microsoft's Virtual CD/ISO mounting tool (Virtual CD Control panel)


Don't have an ISO? Create your own with ISO Recorder v2. This allows a simple right-click on a CD drive to write an ISO.

Wikipedia on ISO

Keywords: ISO, mounting, ISO mounting, CD emulator, DVD emulator, Daemon Tools, Mount ISO, Windows, Windows CD emulator, Windows DVD emulator.

Windows and multiple file renaming (creating a sequence)

I never thought to try this, but multiple rename is built into Windows. It is simple enough. (1) Select multiple files. (2) Rename one of them. (3) The others will be renamed with a sequence of numbers appended to the end of the file name. It is explained in a little more detail at the below link.
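The same effect is easy to script when Explorer isn't an option. A Python sketch (the exact numbering Explorer produces can vary a little between Windows versions; this mimics the common base, base (1), base (2) pattern):

```python
import os

def rename_sequence(folder, base):
    # Give every file in the folder the same base name, appending a
    # numeric suffix to all but the first, like Explorer's bulk rename.
    for i, name in enumerate(sorted(os.listdir(folder))):
        ext = os.path.splitext(name)[1]
        new = f"{base}{ext}" if i == 0 else f"{base} ({i}){ext}"
        os.rename(os.path.join(folder, name), os.path.join(folder, new))
```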

Tips, Articles & Reviews on Windows, Gadgets, Web Services, etc, Collected in My Bucket: Rename Multiple Files For Free

Keywords: multiple file rename in Windows, rename files, Windows, Windows XP, rename multiple files windows GUI.

Links to Search Engine Webmaster Tools (Google, Yahoo, MSN)

After the advent of Google's Webmaster Tools there appeared Yahoo's Site Explorer. I suspect that MSN is going to be next in that space. It is in these tools that the user gets the ability to configure some of the high-level details for search engine optimization.

(1) Google - Webmaster Tools
(2) Yahoo - Site Explorer
(3) MSN/live search - MSN Search Web Crawler and Site Indexing Tools and Services for Site Owners - No tools/submission
(4) Ask.com - Webcrawler information - No tools/submission

For those of you that are casually interested in increasing your page ranks/positions, these would be the places to go.

At this point in time, the primary functionality these appear to provide is the ability to submit Sitemap feeds to the search engine, allowing two methods of listing sites for these search engines: (one) the crawler/bot and (two) the submission of URLs.

Prior blogs on a similar vein:
(1) Google Webmaster Tools and HOWTO/how to add a sitemap for Blogger/blogspot.com
(2) Google webmaster Tools and HOWTO/how to verify your Blogger/blogspot.com site

Wednesday, November 01, 2006

HOWTO use logparser to find what machine locked out an account

I've been in a number of organizations where the who, what, where, when and how of an account getting locked out is, umm, a mystery. This is because the regular login/logout data and other authentication data is bundled in with the 1 or 10 errors per day; the truth is obfuscated by too much data. The biggest problem always appears to be with service accounts, since so many things depend on one account.

It turns out it can be relatively simple to write an MS logparser query to hunt out this information. AKA, logparser is your best friend. The second thing to note is that EventID 644 is the event that is written when an account is locked out. The rest is really the details.

  1. Install Logparser - Logparser download from Microsoft
  2. Create a file by the name of lockedaccounts.sql at the same directory as your logparser.exe (or add the folder that holds logparser.exe to the path).
    file contents:

    SELECT timegenerated AS TimeLockedout,
    extract_token(strings, 0, '|') As UserName ,
    extract_token(strings, 1, '|') AS OriginatingMachine,
    EventID,
    SourceName,
    Message,
    CASE EventID
    WHEN 529 THEN 'Invalid userid/password'
    WHEN 531 THEN 'Account disabled'
    WHEN 539 THEN 'Account locked out'
    WHEN 530 THEN 'Outside of logon time'
    WHEN 532 THEN 'Account Expired'
    WHEN 535 THEN 'Password Expired'
    WHEN 533 THEN 'User not from allowed system'
    WHEN 644 THEN 'Account Auto Locked'
    WHEN 540 THEN 'Successful logon'
    ELSE 'Not specified' END AS EventDesc,
    strings
    INTO lockedact.csv
    FROM \\%DOMAINCONTROLER%\Security
    WHERE EventID=644
  3. run the following command: (it has a 90 second run time on ~500,000 remote eventviewer records)
    C:\>logparser file:lockedaccounts.sql?DOMAINCONTROLER=ADOMAINCONTROLER
  4. Open the lockedact.csv file in Excel. Hunt out the account you want to analyze. The column ‘OriginatingMachine’ is the machine that locked out the account. The other columns are there for info only. Note that EventID 644 is the one you are interested in (http://www.ultimatewindowssecurity.com/events/com264.htm).
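Once lockedact.csv exists, you don't have to hunt through it in Excel; a few lines of Python can filter it for you. A sketch (it assumes logparser writes the column headers from the SELECT aliases above, which is its normal CSV behavior):

```python
import csv

def lockouts_for(csv_path, username):
    # Return (time, originating machine) pairs for one account from the
    # lockedact.csv produced by the logparser query above.
    with open(csv_path, newline="") as f:
        return [(row["TimeLockedout"], row["OriginatingMachine"])
                for row in csv.DictReader(f)
                if row["UserName"].lower() == username.lower()]
```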

For more (much, much more) on logparser: http://www.logparser.com

For more elaboration on logparser scripts see my blog entry on logparser here: Logparser examples and more.

Keywords: Windows, Active Directory, how to, HOWTO use Microsoft logparser to find what machine locked out an account in Windows, what machine locked out an account.

HOWTO use regexps in Visual Studio to automate code/SQL/data creation

This is my little documentation area for regexps that I create when I need to convert one form of text to another. Being in Windows, I've taken to simply using Visual Studio's regexp Find and Replace.

My plan is to update this entry as I do other regexps in my day-to-day work life.

First off, make sure to select Use: Regular Expressions in the Find and Replace dialog.



(1) IDL definition into PL/SQL
To replace:
LINEITEMFLAG_****** = 1, //bit 1
with
insert into lineitem_flags values ('1', 'LINEITEMFLAG_****** Description', 'LINEITEMFLAG_******');

regexp search for: {LINEITEMFLAG_[A-Z,_]*}:b*=:b*{.*},.*$
regexp replace with: insert into lineitem_flags values ('\2', '\1 Description', '\1');
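The same find/replace translates to Python's re module if you want to run it outside the IDE. Visual Studio's {...} groups become (...) and :b (a blank) becomes [ \t]; the flag name in the example below is made up, since the original masks it with ******.

```python
import re

# Visual Studio form: {LINEITEMFLAG_[A-Z,_]*}:b*=:b*{.*},.*$
pattern = re.compile(r"(LINEITEMFLAG_[A-Z_]*)[ \t]*=[ \t]*(.*?),.*$")
template = r"insert into lineitem_flags values ('\2', '\1 Description', '\1');"

line = "LINEITEMFLAG_VOIDED = 1, //bit 1"   # hypothetical flag name
print(pattern.sub(template, line))
```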

Thursday, October 26, 2006

HOWTO create your 'own' Google custom search engine

Feature of the day from Google: build your own custom search engine from Google's data.

Check it out at Google Co-op Custom Search Engine.

Google's FAQ is here.

Why would you want this? For instance, what if you wanted a search engine for your own 7 sites? You could use this to make a search engine covering just those 7 sites.

  1. Go to Google Co-op Custom Search Engine
  2. Click on the Create a Search Engine.
  3. Login to your Google account
  4. Setup the 6 configuration points
    Name: The name for your custom search engine
    Description: A longer description
    Keywords: Pages containing these keywords will be promoted in the search results. For example, if you wanted to promote FAQs for your documentation search engine, the docs with FAQ will rank higher in the results.
    Sites to Search: the urls of sites you'd like in your search engine
    How to Search: Whether or not you'd like to include the 'internet' in the results or exclude it.
    Contributors: Whether or not it is a collaboration

  5. Hit the Next button and you have your custom search engine. The following page simply lets you try it and confirms it was created.


    I've created a search engine for my sites (Paul Cooley's Sites) and for Linux documentation.
Going into the control panel for these sites you will find additional options like updating the look and feel, refinements, adding sites, code to inject this custom search engine into a site, etc.

This ability to create these custom search engines and have them on our sites will be a big step. We now can filter the contents according to our suggestions. I think the opportunities will only be limited by our imaginations.

On that note, I wonder if there is an API. Imagine tweaking a 'search' dialog on the fly to the specifics of a page or according to user selection?

Monday, October 23, 2006

Gentoo Linux HOWTO configure a SOCKS proxy server

Inspired by my wife's difficulty connecting to the internet due to new security policies at her organization, I decided to try a proxy to allow her to use Windows Live Messenger.


The Linux SOCKS proxy server implementation these days goes by the name Dante. Their site is here.

In Gentoo it is in the Portage tree, so the step-by-step is below:

  • emerge dante
  • edit the config file (/etc/socks/sockd.conf). Open that file in your favorite editor.
    It is in this file that logging is enabled via the syslog mechanism and internal and external addresses are bound. Whereas the internal bindings include a port specification, the external one does not.
    The comments are well formed; I'd also spend a little time looking them over.

    The details:
    logoutput: syslog

    internal: eth1 port = 1080
    internal: 127.0.0.1 port = 1080

    external: 1.2.3.4
    # or
    external: eth0
    To achieve full access (no username/password):
    method: username none

    # Not using authentication, so unnecessary
    #user.privileged: proxy

    user.notprivileged: nobody
    The access controls for the sockd daemon are last. They are checked in the order they appear in the configuration file. Notice: don't open your proxy server to the wild world - you've been warned.

    The first three directives control which IP ranges have access to the server.
    - The from: is where the details of the IPs are added. In my case it is the IP space the clients live in.
    - The to: option is one of the IPs the proxy server is bound to that the given IP range can speak to. It is set to the addresses Dante/sockd is listening on.
    The last of the three drops any requests that don't match either of the first two directives.

    client pass {
    from: 192.168.0.0/16 port 1-65535 to: 0.0.0.0/0
    }

    client pass {
    from: 127.0.0.0/8 port 1-65535 to: 0.0.0.0/0
    }

    client block {
    from: 0.0.0.0/0 to: 0.0.0.0/0
    log: connect error
    }
    The next four configuration points control the 'routing'.
    - Requests from anywhere to the loopback addresses are dropped.
    - Requests from the loopback addresses and 192.168.0.0/16 are allowed to communicate over the tcp or udp protocols.
    - Finally, drop everything else.
    block {
    from: 0.0.0.0/0 to: 127.0.0.0/8
    log: connect error
    }

    pass {
    from: 192.168.0.0/16 to: 0.0.0.0/0
    protocol: tcp udp
    }

    pass {
    from: 127.0.0.0/8 to: 0.0.0.0/0
    protocol: tcp udp
    }

    block {
    from: 0.0.0.0/0 to: 0.0.0.0/0
    log: connect error
    }
  • Start Dante/sockd.
    sockd -V // this verifies configuration and exits
    sockd -d // this enables debugging to the console.
    That will start Dante in debugging mode.

The help page for your reference
localhost ~ # sockd -h
sockd: usage: sockd [-DLNVdfhnv]
-D : run in daemon mode
-L : shows the license for this program
-N : fork of servers [1]
-V : verify configuration and exit
-d : enable debugging
-f <filename> : use <filename> as configuration file [/etc/socks/sockd.conf]
-h : print this information
-n : disable TCP keep-alive
-v : print version info

if you'd like sockd to start on the default runlevel:
rc-update add sockd default

Next would be configuring your browser and testing this. Using IE, configure it to use a proxy server and enter the server name and port (1080), then close the browser. Restart the browser and request a page. If it works, then great, move on. Otherwise you'll start to debug (time to inspect /var/log/*).
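For a quick sanity check without a browser, it helps to know what a SOCKS5 client actually sends after method negotiation. This sketch only builds the CONNECT message from RFC 1928 (Dante speaks SOCKS4 and SOCKS5); it doesn't talk to the server:

```python
import struct

def socks5_connect(host, port):
    # RFC 1928 CONNECT request: VER=5, CMD=1 (connect), RSV=0,
    # ATYP=3 (domain name), length-prefixed host, 2-byte big-endian port.
    name = host.encode()
    return (b"\x05\x01\x00\x03" + bytes([len(name)]) + name
            + struct.pack(">H", port))
```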



Gentoo adding a service to the default run-level

This is done millions of times a week, I am sure. I just want to write it down in the hope of imprinting it on my brain. It is very simple with Gentoo.

rc-update is the tool that Gentoo uses to abstract the guts that often are associated with adding a service to certain runlevels.

Example with adding ntpd to the default run level.

rc-update add ntpd default

RC-UPDATE MAN page details.
NAME
rc-update - add and remove init scripts to a runlevel

SYNOPSIS
rc-update add script <runlevels>
rc-update del script [runlevels]
rc-update show [--verbose] [runlevels]

DESCRIPTION
Gentoo's init system uses named runlevels. Rather than editing some
obscure file or managing a directory of symlinks, rc-update exists to
quickly add or delete init scripts from different runlevels.

All scripts specified with this utility must reside in the /etc/init.d
directory. They must also conform to the Gentoo runscript standard.

Saturday, October 21, 2006

Improved Seattle (WSDOT) traffic flow for mobile devices

My wife found that the Washington State Department of Transportation started creating traffic flow maps for mobile devices for the Greater Seattle (Puget Sound) area. One problem though, the page layout isn't ideal for most of our commuting; the maps are split across our routes. Downloading two pages on your blackberry? No thanks.

The solution? Build our own HTML page that references all the small images so they all load with one page. Then create links to the other reference pages such as the travel times.

Do you want to use our Seattle traffic flow maps for mobile or wireless devices? Just use the link you like better:

http://traffic.paulcooley.com
http://traffic.lauracooley.com

Happy Commuting in the Puget Sound area!
Paul Cooley

Seattle (WSDOT) traffic flow for handheld devices

Tuesday, October 17, 2006

HOWTO have a traffic map(image) as your screen saver

This HOWTO is for Windows XP. Traffic maps are becoming increasingly prevalent. The age of the internet has improved us? In Seattle the traffic map is actually rendered as a GIF image.

If you are in the lucky few cities of:
Atlanta, San Francisco Bay Area, Chicago, Denver, Houston, Los Angeles, Louisville, Milwaukee, Minneapolis, OC/Inland Empire, Phoenix, Portland, Salt Lake City, San Diego, San Fernando Valley, Seattle

You can use the TrafficGauge download.

However, I've found the WSDOT traffic map is much more detailed than TrafficGauge.

An alternative solution is:

  • download SeqDownload - This utility allows you to schedule the automatic download of an image file
  • Install SeqDownload
  • Configure SeqDownload
  1. Run SeqDownload
  2. Click the Run at Startup, Click the Run In System Menu
  3. Click New (for a new scheduled download)
  4. Configure the URL (for Seattle: http://images.wsdot.wa.gov/nwflow/flowmaps/videomap_Seattle.gif)
    - Pick a new folder to download images into (c:\traffic\download)
    - Pick a time frame to download the image (5 minutes?)
    - Select save to the same filename every time
    - Pick the length of time for this downloading (1 year?)
    - Press Create New Item

  • Set up the Windows XP screen saver to use this folder, of one image, for its slide show
  1. Right-click on the desktop - select Properties (or your favorite method to change the screensaver)
  2. Select the Screen Saver Tab followed by My Pictures Slideshow
  3. Select the directory that you are downloading the traffic image to (from the SeqDownload above - in my case C:\traffic\download)


Click OK and now the Seattle traffic map will be your screen saver, updating every 5 minutes with a new image.
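If you'd rather not install SeqDownload, the periodic download itself is only a few lines of Python. A sketch (the fetch function is injectable so it can be tested or swapped out; the default uses urllib from the standard library):

```python
import time
import urllib.request

def poll_image(url, dest, interval_sec, iterations,
               fetch=urllib.request.urlretrieve):
    # Download url to the same dest file every interval_sec seconds,
    # which is all SeqDownload does for the screen saver trick above.
    for _ in range(iterations):
        fetch(url, dest)
        time.sleep(interval_sec)
```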

Thursday, October 12, 2006

Google Webmaster Tools and HOWTO add a sitemap for Blogger/blogspot.com

Setting up a Sitemap file for a Blogger/Blogspot.com blog is an easy thing to do. The question of what a Sitemap file is, and why you should have one, is answered here.

Prerequisite:
(1) your site set up/verified in Google Webmaster Tools (more on that in my blog entry here).

HOWTO:
(1) Login to Webmaster Tools (link)
(2) Click On the Add a Sitemap link
(3) Choose Add General Sitemap
(4) type in: *********.blogspot.com/rss.xml (your blog name for *********)
(5) Click on Add Sitemap button.

You are done! That is it. Simple.

After posting this I found that there is also an equivalent site at Yahoo - Yahoo Site Explorer. It functions similarly and you could basically do the same thing over there.

Google webmaster Tools and HOWTO verify your Blogger/blogspot.com site

There are two important things you can do with the Google Webmaster tools for your blogger site.

One -> Verify your site
and
Two -> Submit a sitemap.

What we are interested in here is verifying a Blogger site (ONE!). So here is how to do that.

** Google Webmaster Tools Verify your Blogger site. **

(1) Control your blog site through Google Webmaster tools - login/signup for Google's Webmaster tools and add your blogger site *****.blogspot.com
(2) Click on the Verify link
(3) Choose Add a Metatag - note the META tag information
(4) In New Tab (Browser): Login to your Blogger account. Click on your blog of interest and click on the template tab. After the < >> tag paste in the META tag information as found in step (3)
(5) Republish your entire blog.
(6) Go back to the Google Webmaster tools - Click Verify. You are done and your blog has been verified. The next step you'll want to do is add a sitemap.

Wednesday, October 11, 2006

Tunneling Remote Desktop through SSH

I just googled "port forwarding remote desktop putty" and realized this is easy. I am remote-desktoping from Windows machines outside of my home network into my Windows XP machine on my home LAN. PuTTY and OpenSSH can make this easy, maybe even ultra easy!

Prerequisites:

  1. A server running SSH (OpenSSH?) - of course this is a Gentoo Linux server at my house, but it doesn't need to be. * Configuration of this is outside the scope of this blog.
  2. puTTY - download to the computer that will be the client
Set up PuTTY:

PuTTY is a free software SSH, Telnet, rlogin, and raw TCP client. It is perfect for this.
  1. Set up the tunnel port forwarding in PuTTY.
    - Click on Tunnels (under Connection > SSH) to configure the tunnel.
    - Enter a source port; this is the port on the local machine, e.g. 3390.
    - Enter a destination host name (or IP) and the port number 3389, separated by a colon. For example, for a Remote Desktop session to the PC machinewithRDP, enter machinewithRDP:3389.
    - Click the Add button.
    - Repeat for other hosts.
    - Click on SSH to configure the top-level SSH options.
    - Turn on compression.
    - Select SSH protocol version 2 only.
    - Click on Session to configure the session.
    - Save your settings.
  2. Connect to the SSH server with your username/password
  3. Start Remote Desktop
    - Use the local address and port you set up in PuTTY: localhost:3390
    - Click Connect


You are done and connected! Nice work.
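The GUI steps above can also be expressed as a single command using plink, PuTTY's command-line client. A sketch, with the server name as a placeholder (machinewithRDP is the example host from the steps above):

```shell
# Build the plink equivalent of the GUI configuration above:
#   -C  compression, -2  SSH protocol 2 only, -L  local port forward
LOCAL_PORT=3390
RDP_HOST="machinewithRDP"                 # the XP box on the home LAN
SSH_SERVER="user@sshserver.example.com"   # placeholder SSH server name

CMD="plink -ssh -C -2 -L ${LOCAL_PORT}:${RDP_HOST}:3389 ${SSH_SERVER}"
echo "$CMD"
# Run the printed command, then point Remote Desktop at localhost:3390.
```

Handy if you want to script the tunnel instead of clicking through the saved session each time.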

Keywords: Tunneling Remote Desktop through SSH with Putty and openSSH. RDP, Windows, Tunnel, Gentoo, Linux, D-Link Wireless Router, DI-724 DU.

Tunneling HTTP/web/port 80 traffic (requests) through SSH

I just bumped into something I've never done before but think is really cool: tunneling HTTP/port 80 traffic through SSH.

A scenario that applies to me: I would like to view an intranet website that sits behind a firewall/router. The website isn't meant for public consumption, just for myself; for instance, the configuration page of the D-Link DI-724DU wireless router on my home network.

What you need:
(1) A server running SSH - in my case a Gentoo Linux machine running sshd, configured correctly. Reference name: remotehost.remotedns.org
(2) The DI-724DU (or some other router with port forwarding functionality). Reference name: di724-192-168-0-1 (192.168.0.1 is the default ip)
(3) The SSH port forwarded by the router to the above-named server.
(4) An SSH client - for me, OpenSSH on a VMware server running Gentoo (outside of my home LAN). Reference name: host.outsidenetwork.com

On the VMware Gentoo Linux machine that is outside of my home LAN, simply type:
(1) ssh -L 2022:192.168.0.1:80 username@remotehost.remotedns.org
where 192.168.0.1 is the IP of the router di724-192-168-0-1
(2) Enter the username's password for the remotehost.remotedns.org machine
(3) Start a browser on host.outsidenetwork.com and enter http://localhost:2022.

You will get the web page of the D-Link DI-724DU wireless router.
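The three steps collapse into a single ssh invocation. A sketch using the reference names above; adding -N is optional and just tells ssh to forward the port without opening a remote shell:

```shell
# Compose the tunnel command from the article's reference names.
ROUTER_IP="192.168.0.1"   # di724-192-168-0-1, the DI-724DU default address
CMD="ssh -N -L 2022:${ROUTER_IP}:80 username@remotehost.remotedns.org"
echo "$CMD"
# After authenticating, browse to http://localhost:2022 on the client.
```

Note that the forward's destination (192.168.0.1) is resolved from the SSH server's side of the tunnel, which is why a private LAN address works here.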

Cool! Simple as 1,2,3!