<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[XYZPDQ]]></title><description><![CDATA[Thoughts, stories and ideas.]]></description><link>http://184.73.26.214:80/</link><image><url>http://184.73.26.214:80/favicon.png</url><title>XYZPDQ</title><link>http://184.73.26.214:80/</link></image><generator>Ghost 3.20</generator><lastBuildDate>Thu, 25 Sep 2025 05:43:16 GMT</lastBuildDate><atom:link href="http://184.73.26.214:80/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Finding nearby cities using SQL Server]]></title><description><![CDATA[Describes how to use SQL Server's spatial capabilities, and freely available geographical data to find all cities within a given distance from a center point. ]]></description><link>http://184.73.26.214:80/finding-nearby-cities-using-sql-server/</link><guid isPermaLink="false">5eeffd511e769b0ce1658efb</guid><category><![CDATA[SQL Server]]></category><category><![CDATA[Spatial]]></category><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Mon, 05 Dec 2016 00:12:26 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>A scenario arose recently that required me to be able to find all of the cities within a given radius from a geographic center-point (latitude / longitude).</p>
<p>At first I was tempted to utilize an existing mapping service to handle this for me, but most of those services impose limits on the number of requests that you can send in a given time period.  In this instance, I needed something that was not going to be subject to those limits.</p>
<p>There are a lot of different ways to solve this problem.  In this post, I am going to cover how to do so using SQL Server 2012+.</p>
<p>The first step is to locate a list of cities and their latitude/longitude.  A good source for this is <a href="http://www.geonames.org/">GeoNames.org</a>.  There are a wide variety of data downloads available.  For my purposes, I am going to grab <a href="http://download.geonames.org/export/dump/cities5000.zip">cities5000.zip</a>.  This archive contains a list of all of the cities with a population of 5,000 or more.</p>
<p>There is a lot of really good documentation about what the download contains, but for ease of reference, I'll reproduce it here:</p>
<pre><code>geonameid         : integer id of record in geonames database
name              : name of geographical point (utf8) varchar(200)
asciiname         : name of geographical point in plain ascii characters, varchar(200)
alternatenames    : alternatenames, comma separated, ascii names automatically transliterated, convenience attribute from alternatename table, varchar(10000)
latitude          : latitude in decimal degrees (wgs84)
longitude         : longitude in decimal degrees (wgs84)
feature class     : see http://www.geonames.org/export/codes.html, char(1)
feature code      : see http://www.geonames.org/export/codes.html, varchar(10)
country code      : ISO-3166 2-letter country code, 2 characters
cc2               : alternate country codes, comma separated, ISO-3166 2-letter country code, 200 characters
admin1 code       : fipscode (subject to change to iso code), see exceptions below, see file admin1Codes.txt for display names of this code; varchar(20)
admin2 code       : code for the second administrative division, a county in the US, see file admin2Codes.txt; varchar(80) 
admin3 code       : code for third level administrative division, varchar(20)
admin4 code       : code for fourth level administrative division, varchar(20)
population        : bigint (8 byte int) 
elevation         : in meters, integer
dem               : digital elevation model, srtm3 or gtopo30, average elevation of 3''x3'' (ca 90mx90m) or 30''x30'' (ca 900mx900m) area in meters, integer. srtm processed by cgiar/ciat.
timezone          : the iana timezone id (see file timeZone.txt) varchar(40)
modification date : date of last modification in yyyy-MM-dd format
</code></pre>
<p>You can pull all or part of the data into SQL.  I am going to pull in all the columns.  I don't need all of them at the moment, but they could be useful later on.</p>
<p>The following will create a table for the city data and use <code>BULK INSERT</code> to populate it.<br>
<em>Note: this assumes that you have unzipped the data to <strong>c:\temp</strong></em></p>
<p>First create the table</p>
<pre><code class="language-sql">CREATE TABLE Cities (
geonameid       int	NOT NULL PRIMARY KEY,
[name]          nvarchar(200) NOT NULL,              
asciiname       nvarchar(200) NOT NULL,
alternatenames  nvarchar(max),  
latitude        numeric(18,15),  
longitude       numeric(18,15), 
feature_class   char(1), 
feature_code	varchar(10),      
country_code    char(2),  
cc2             nvarchar(200),  
admin1_code     nvarchar(20),  
admin2_code     nvarchar(80),  
admin3_code     nvarchar(20),  
admin4_code     nvarchar(20),  
[population]	bigint,
elevation       int,  
dem             int,  
timezone        nvarchar(40),  
modification_date_tmp	nvarchar(50)
)
</code></pre>
<p>Now, import the data</p>
<pre><code class="language-sql">BULK INSERT Cities FROM 'c:\temp\cities5000.txt'
WITH
(
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
);
</code></pre>
<p>Ok.  We have our raw data.  Now we need to let SQL Server know how to plot each of these cities on a geographic plane.  To do this, we are going to use the <code>geography</code> spatial data type that is available in SQL Server.  <code>geography</code> is actually a .Net CLR data type that is specifically tailored to work with latitude and longitude.  You can read more about <code>geography</code> <a href="https://msdn.microsoft.com/en-us/library/cc280766.aspx">here</a>.</p>
<pre><code class="language-sql">ALTER TABLE Cities 
ADD Point AS CONVERT([geography],
                     CASE WHEN [Latitude]&lt;&gt;(0) 
                          AND [Longitude]&lt;&gt;(0) 
                     THEN Geography::Point([Latitude],[Longitude],(4326))  
                     END,(0))

</code></pre>
<p>Let's break down what is happening above.</p>
<p>First, I'm adding a <code>Point</code> as a column to <code>Cities</code>.  In this instance, I'm creating a computed column.  As <code>Latitude</code> or <code>Longitude</code> is updated, this will automatically update the value for <code>Point</code>.  If you would prefer, you can create <code>Point</code> as a <code>geography</code> column and run an <code>update</code> statement to do this calculation once. (just remember to re-run <code>update</code> if you re-import the data!).</p>
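<p>As a sketch of that alternative (assuming the same <code>Cities</code> table from above), the one-time approach would look something like this:</p>
<pre><code class="language-sql">-- One-time population instead of a computed column
ALTER TABLE Cities ADD Point geography NULL;

UPDATE Cities
SET Point = geography::Point(latitude, longitude, 4326)
WHERE latitude &lt;&gt; 0 AND longitude &lt;&gt; 0;
</code></pre>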
<p>I'm using <code>CONVERT</code> to turn the data into a <code>geography</code> type.</p>
<p>I'm wrapping the data in a <code>CASE</code> statement to skip the conversion if <code>Latitude</code> and <code>Longitude</code> aren't set properly.</p>
<p>Finally, I'm converting the <code>Latitude</code> and <code>Longitude</code> into a geographical point using <code>Geography::Point()</code>.  The <code>::</code> is SQL Server syntax for calling the <code>Point</code> method on the <code>geography</code> data type.  Think of it like a static method on a class.</p>
<p><code>Point</code> takes a latitude, a longitude, and something called a SRID (Spatial Reference Identifier).  Here, I'm using <a href="http://spatialreference.org/ref/epsg/wgs-84/">4326</a>, which gives us the standard -180:180/-90:90 coordinate system that you're probably used to seeing.</p>
<p>If you open SQL Server Management Studio (SSMS) and select all the records from cities, you should see a new tab called &quot;Spatial Results&quot;.  Clicking that will give you something similar to this.</p>
<p><img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/cities-1480894858347.PNG" alt="First 5,000 cities"></p>
<p>Great!</p>
<p>Now then, what if we want to use this data to find all of the cities within 5 miles of Manhattan?</p>
<p>First, let's grab the <code>Point</code> from Manhattan.</p>
<pre><code class="language-sql">SELECT Name, Point FROM cities WHERE name = 'Manhattan' AND admin1_code = 'NY'
</code></pre>
<p>The result should resemble the following:</p>
<pre><code>Manhattan	0xE6100000010C475A2A6F47644440A4703D0AD77D52C0
</code></pre>
<p>That hex string on the right is the text representation of Manhattan's center-point.</p>
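<p>The hex value is compact but opaque.  If you want something human-readable, the <code>geography</code> type can render itself as well-known text (WKT) via <code>STAsText()</code>:</p>
<pre><code class="language-sql">SELECT Name, Point.STAsText() AS WKT
FROM cities 
WHERE name = 'Manhattan' 
AND admin1_code = 'NY'
</code></pre>
<p>Note that WKT lists longitude first, so expect something like <code>POINT (-73.96 40.78)</code> rather than latitude-first.</p>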
<p>To draw a 5 mile circle around that center-point we are going to use another built-in method called <a href="https://msdn.microsoft.com/en-us/library/bb933965.aspx">STBuffer</a>.</p>
<p>STBuffer takes a single distance argument, which is defined in meters.  Since we are trying to work in miles, we are going to have to do a conversion.</p>
<pre><code class="language-sql">SELECT Name, Point, Point.STBuffer(5 * 1609.344) as SearchArea
FROM cities 
WHERE name = 'Manhattan' 
AND admin1_code = 'NY'
</code></pre>
<p>The <strong>1609.344</strong> in the query above is the number of meters in a mile.</p>
<p>This time if you look at the Spatial Results (and select <code>SearchArea</code> from the dropdown on the right) you'll see something like this:</p>
<p><img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/manhattan25-1480895771302.PNG" alt="Manhattan 5 mile radius"></p>
<p>Now for the final step.</p>
<pre><code class="language-sql">DECLARE @SearchArea GEOGRAPHY

SELECT @SearchArea = Point.STBuffer(5*1609.344)
FROM cities 
WHERE name = 'Manhattan' 
AND admin1_code = 'NY'

SELECT Name, Point
FROM Cities
WHERE [point].STIntersects(@SearchArea) = 1 
</code></pre>
<p>In the query above, I have saved the search area we defined to a variable so that it is easier to use later on.</p>
<p>The heavy lifting here is being done by <a href="https://msdn.microsoft.com/en-us/library/bb933962.aspx">STIntersects</a>.  Like STBuffer, STIntersects is a method available off of the <code>geography</code> type.  In this case, it takes another <code>geography</code> type as its argument.  STIntersects will determine if the two <code>geography</code> instances cross over one another.</p>
<p>It is important to notice that the circle from the previous step is solid, not just an outline.  Since our goal is to find everything contained inside the radius, an outline-only circle would only intersect points that fall exactly on its edge.  In all likelihood we would not get any results at all!</p>
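<p>As a sketch of an alternative approach (not what we built above), <code>STDistance</code> can express the same filter without constructing a buffer at all, by comparing distances in meters directly:</p>
<pre><code class="language-sql">DECLARE @Center GEOGRAPHY

SELECT @Center = Point
FROM cities 
WHERE name = 'Manhattan' 
AND admin1_code = 'NY'

SELECT Name, Point
FROM Cities
WHERE [point].STDistance(@Center) &lt;= 5 * 1609.344
</code></pre>
<p>Both forms should return the same cities; which performs better depends on your data and indexes.</p>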
<p>What did we get for all of our effort?  Here are all the cities (with a population of 5,000 or more) within 5 miles of Manhattan's center-point.</p>
<blockquote>
<p>Cliffside Park<br>
Edgewater<br>
Fairview<br>
Fort Lee<br>
Guttenberg<br>
Hoboken<br>
North Bergen<br>
Palisades Park<br>
Ridgefield<br>
Secaucus<br>
Union City<br>
Weehawken<br>
West New York<br>
Long Island City<br>
Manhattan</p>
</blockquote>
<p>This barely scratches the surface of what is available as part of the Geography/Geometry functionality inside of SQL Server.  For a more complete list of all of the OGC (STXXXX) methods refer to the <a href="https://msdn.microsoft.com/en-us/library/bb933917.aspx">MSDN article</a>.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Working in miniature]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Working with computers day in and day out can leave me burned out.  I’ve tried a variety of ways to deal with this over the years, and about 7 years ago, I keyed in on one of my other passions and leveraged that as a way to “unplug” (so</p>]]></description><link>http://184.73.26.214:80/working-in-miniature/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef4</guid><category><![CDATA[unplugged]]></category><category><![CDATA[ships]]></category><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Sun, 09 Jun 2013 16:55:20 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Working with computers day in and day out can leave me burned out.  I’ve tried a variety of ways to deal with this over the years, and about 7 years ago, I keyed in on one of my other passions and leveraged that as a way to “unplug” (so to speak).  I am fascinated with the age of sail.  To me, tall ships are some of the most majestic things that have ever been built by man.  While I am aware of the grim realities of life in that era, it is still possible to romanticize it.  However, I digress.  I leveraged this passion and began building model ships.  I started with plastic models and quickly burned through a variety of models due to the relative simplicity of putting together a plastic model.  Then, I few years ago, my father gave me a wooden ship model.  I loved it.  That is when I truly began to turn to model ship building as a way to turn off my digital life.  Since each model takes months and in some cases years to finish, there is always something there waiting for me.</p>
<p>I’ve built several more wood models since that first one that my father gave me, and I have been slowly learning tricks and techniques to add more and more detail to each subsequent model.  Most recently I have been working on a cross section of the H.M.S. Victory.  As I have been tying down the rigging, I started thinking about ways to up the level of detail on this model.  One of the ways that I am trying to accomplish this is by adding the rope coils that would exist on the pins and decking of a working ship.  At the scale that I am working, this proved to be an interesting exercise.  I wanted to make two different types of coils.  The first is the type that all of us are familiar with, the type that most of us use to bundle extension cords and lengths of rope that we have laying about the house.  The second type is a spiral that would lie flat on the deck and out of the way.</p>
<p>The first type of coil is relatively easy.  You need scale rope (thread will work), tweezers and some glue.  Toward the narrow end of the tweezers, start wrapping the thread around, using your thumb to hold the loops in place.  Depending on the size of the coil you want to make, add additional loops.  Don’t wrap too tightly, or you will start compressing the tweezers.  To help with this, depending on how dexterous you are, you can use one of the fingers you are holding the tweezers with to keep them separated.  Once you have the correct number of loops (I recommend 4-6), take the thread and wrap it once lengthwise inside the tweezers around the coil that you just created.  Start pulling that single lengthwise loop tight as you slowly move the coil toward the end of the tweezers.  DO NOT take the coil off the tweezers yet.  By moving the coil toward the end of the tweezers, it should allow you to fully tighten the lengthwise loop around the coil.  Continue wrapping additional lengthwise loops until you are content with the finished product.  At this point, you should have something that resembles the picture.  For the final loop, cross the thread back under itself (just like you would on a larger version of the coil).  With the coil still on the tweezers, apply some glue to the lengthwise coils and then set it aside to let it dry.</p>
<p>The second type of coil is our flat spiral.  If you are familiar with sailing lexicon, this is called a <a href="http://www.animatedknots.com/flemish/index.php">Flemish Flake</a>.  Normally, these are started from the outside in, but in our case, we are going to start them from the inside out.  We are going to need wide packing or masking tape, tweezers, scale rope (thread will work again), and some white glue.  Tear off a piece of tape about 2-3 inches long.  With a length of rope/thread, use the tweezers and press it down onto the tape.  Using the tweezers to keep the rope pinned against the tape, start wrapping the rope around the center point.  Once you have a full revolution around the center point, pull up the tweezers and press out the circle you have made so that it is flat against the tape.  Again, use the tweezers to hold the center and start wrapping the rope around the coil, keeping the rope as close to parallel to the tape as possible.  What you are trying to do is pull the rope tight against the previous round of the coil, using the tweezers to create a barrier between the rope and the table that the tape is laying on, without letting the rope double up on itself.  The first few rounds of the coil are the most delicate; the more you have, the easier the going.  After you have created a coil / flake of the desired size, take some white glue and rub it across the surface.  Once it dries, you will be able to pull the coil off of the tape and it will stay in the desired shape.</p>
<p>From here, it is up to your imagination about where to place your new creations.  In both of the above, it is advantageous to leave a length of rope/thread off of the end of your coil so that you can use that to blend your coil in with the rigging that you have run on your model.</p>
<div class="container-fluid">
<div class="row">
<div class="col-md-4">
<img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/IMG_0452-1480900709613.jpg" class="img-responsive img-thumbnail">
</div>
<div class="col-md-4">
<img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/IMG_0449-1480900737977.jpg" class="img-responsive img-thumbnail">
</div>
<div class="col-md-4">
<img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/IMG_0453-1480900763371.jpg" class="img-responsive img-thumbnail">
</div>
</div>
</div>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Monitor Windows EC2 Instance Free Drivespace via Cloudwatch]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Amazon offers a lot of great <a href="http://aws.amazon.com/cloudwatch/">CloudWatch</a> metrics out of the box for EC2 instances, but there is a key metric that is missing.  Remaining available drive space.</p>
<p>Luckily, there is a relatively easy way to resolve this.  AWS offers us the ability to add in our own metrics that</p>]]></description><link>http://184.73.26.214:80/monitor-windows-ec2-instance-free-drivespace-via-cloudwatch/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef3</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Sun, 07 Apr 2013 11:09:10 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Amazon offers a lot of great <a href="http://aws.amazon.com/cloudwatch/">CloudWatch</a> metrics out of the box for EC2 instances, but there is a key metric that is missing.  Remaining available drive space.</p>
<p>Luckily, there is a relatively easy way to resolve this.  AWS offers us the ability to add in our own metrics that allow us to monitor pretty much anything we want.  In this case, we are going to create something to report back to CloudWatch about the free drivespace on our EC2 instance.</p>
<p>Before diving in, we need to do a little prep work.</p>
<p>To start, ensure that the <a href="http://aws.amazon.com/sdkfornet/">AWS SDK for .Net</a> is available on your EC2 instance.  It should exist in <code>C:\Program Files (x86)\AWS SDK for .NET\</code> by default.  If it doesn’t for whatever reason, download and install it.</p>
<p>Next, collect your Access Key and the Access Key Secret and put them to the side.  They’ll be used shortly.</p>
<p>The last thing you will need to gather is the instance ID for the EC2 instance that we’re going to collect metrics for.  If you are uncertain what this is, there are a few different ways to find it.  If you are running a more recent image, it is in the top right corner on the background image.  The best way to get the instance ID however is to log in to the AWS Management Console, go to the EC2 section and the instance ID will be shown in the grid.  It should start with a lowercase “i” followed by a dash and a hex string. e.g. i-a1b2c3d4</p>
<p>The next step is putting together a PowerShell script that will utilize all of the things that we have collected to this point to gather and report on the metrics that we’re after.</p>
<p>So, let’s get started.</p>
<p>Open your preferred PowerShell script editor and create a new ps1 file.  It doesn’t matter what it is called. I named mine <strong>updatedrivespacemetrics.ps1</strong> for clarity.</p>
<p>At the top of the script, we are going to import the AWS assembly and set the access key values.</p>
<pre><code class="language-powershell">#Import the AWS SDK assembly  
Add-Type -Path &quot;C:\Program Files (x86)\AWS SDK for .NET\bin\AWSSDK.dll&quot; 

#Credentials        
$secretAccessKeyID=&quot;1234567890ABCDEF1234567890ABCDEF12345678&quot;  
$secretKeyID=&quot;ABCDEF1234567890ABCD&quot;

#Get Instance ID
$instanceId=&quot;i-a1b2c3d4&quot;
</code></pre>
<p>Next, create a request object to store the new Metrics and give it a name that you will be able to easily recognize later on.</p>
<pre><code class="language-powershell">#Create request  
$request = New-Object -TypeName Amazon.CloudWatch.Model.PutMetricDataRequest   
$request.NameSpace = &quot;CUSTOM-Freespace&quot;
</code></pre>
<p>Time to collect the values.  We are going to use WMI to get the freespace from the drives on our instance.</p>
<pre><code class="language-powershell">#Get Free Space 
$freeSpace=Get-WmiObject -Class Win32_LogicalDisk | 
               Select-Object -Property DeviceID, @{Name='FreeSpaceGB';Expression={$_.FreeSpace/1GB}} | 
               Where-Object {$_.DeviceID -eq &quot;C:&quot; -or $_.DeviceID -eq &quot;D:&quot; }
</code></pre>
<p>Let’s look at what this statement is doing.</p>
<pre><code class="language-powershell">Get-WmiObject -Class Win32_LogicalDisk
</code></pre>
<p>This will return an array of all of the drives on the machine with their <em>DeviceID, DriveType, Freespace, Size</em> and <em>VolumeName</em>.</p>
<p>From this array, we are going to select the <em>DeviceID</em> and the <em>FreeSpace</em>.  However, <em>FreeSpace</em> is returning bytes and we need something that is a little easier to parse, so we are going to turn it into gigabytes.  To do this we are going to use an expression to modify the property value before we pull it back.</p>
<pre><code class="language-powershell">@{Name='FreeSpaceGB';Expression={$_.FreeSpace/1GB}}
</code></pre>
<p>Last, we are going to limit the drives that we pull back.  This part is optional, but can be useful if there are only specific drives that you are interested in.  In this case, I have narrowed it down to the C and D drives.</p>
<pre><code class="language-powershell">Where-Object {$_.DeviceID -eq &quot;C:&quot; -or $_.DeviceID -eq &quot;D:&quot; }
</code></pre>
<p>Time to start collecting this information into a way that AWS can parse it.  The first step to do this is creating some basic dimensions that will differentiate our custom metric.  To keep things simple we are going to create two dimensions, <em>Role</em> and <em>InstanceID</em>.</p>
<pre><code class="language-powershell">#Create dimensions 
$dimensions = New-Object System.Collections.ArrayList 
$dimension1 = New-Object -TypeName Amazon.CloudWatch.Model.Dimension 
$dimension2 = New-Object -TypeName Amazon.CloudWatch.Model.Dimension 

$dimension1.Name = &quot;Role&quot; 
$dimension1.Value = &quot;Test Server&quot; 
    
$dimension2.Name = &quot;InstanceID&quot; 
$dimension2.Value = $instanceId 

$dimensions.Add($dimension1) 
$dimensions.Add($dimension2)
</code></pre>
<p>With the dimensions in hand we are going to put everything together into a metric.  For each drive that we collected information on earlier, we are going to create a <em>MetricDatum</em> and populate it with the appropriate values.</p>
<pre><code class="language-powershell">#Create metrics 
$metrics = New-Object System.Collections.ArrayList 
Foreach ($item in $freeSpace) { 
     $metric = New-Object -TypeName Amazon.CloudWatch.Model.MetricDatum     
     $metric.MetricName = $item.DeviceID + &quot; free space&quot; 
     $metric.Value = $item.FreeSpaceGB 
     $metric.Unit = &quot;Gigabytes&quot;   
     $metric = $metric.WithDimensions($dimensions) 
     $metrics.Add($metric) 
}
</code></pre>
<p>After the metrics are created, we need to update the request object that we created earlier with the metrics data</p>
<pre><code class="language-powershell">$request = $request.WithMetricData($metrics)
</code></pre>
<p>Finally, we are going to submit all of this information back to AWS.</p>
<pre><code class="language-powershell">#Perform the request 
$client = [Amazon.AWSClientFactory]::CreateAmazonCloudWatchClient($secretKeyID,$secretAccessKeyID) 
$response=$client.PutMetricData($request)
</code></pre>
<p>That’s it!  Now we have a script that will push information on available drivespace up to CloudWatch.  The first time it is run it will create the metric if it doesn’t exist.  Each subsequent time, it will just add a new data point to the existing metric.</p>
<p>The final step is scheduling this so that it runs consistently.  The easiest way to accomplish this is the built in Windows Task Scheduler.  I won’t go through the steps for that here, but I would suggest setting it up to run every 5 minutes.  If you are uncertain how to call your script, <code>powershell c:\jobs\updatedrivespacemetrics.ps1</code> will get the job done (replace c:\jobs\ with the location of your script).</p>
<img src="http://i2.wp.com/xyzpdq.org/wp-content/uploads/2013/04/cloudwatch.gif" class="img-responsive img-thumbnail">
<p>Now that you have everything set up, you can go into the AWS management console, go to the CloudWatch page and you should see your new metric.</p>
<p>Note:  You will need to set this up for each instance that you want to gather drivespace on.  Each instance will need its own copy of the script and have the task scheduler set up appropriately.  As you copy the script from instance to instance, you will need to change the InstanceID in the script to reflect the appropriate instance.</p>
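<p>One optional tweak (assuming PowerShell 3.0 or later is available on the instance) is to avoid hardcoding the instance ID at all, by asking the EC2 instance metadata service for it at runtime:</p>
<pre><code class="language-powershell"># Query the instance metadata service for the ID of the instance we are running on
$instanceId = Invoke-RestMethod -Uri &quot;http://169.254.169.254/latest/meta-data/instance-id&quot;
</code></pre>
<p>This endpoint is only reachable from inside the instance itself, which is exactly where the script runs, so the same script can then be copied between instances unchanged.</p>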
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Small Design decisions with a large impact]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I checked into a nice hotel yesterday. I was very impressed with the room and promptly set about unpacking my things. Because I had been traveling for nearly 20 hours straight, my devices were desperately in need of power.</p>
<div class="container">
<div class="row">
<div class="col-md-6">
<img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/20130303_065721-1480900897184.jpg" class="img-responsive img-thumbnail">
</div>
<div class="col-md-6">
<img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/20130303_070014-1480900917168.jpg" class="img-responsive img-thumbnail">
</div>
</div>
</div>
<p>The lamp next to the bed conveniently had an outlet in</p>]]></description><link>http://184.73.26.214:80/small-design-decisions-with-a-large-impact/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef6</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Sun, 03 Mar 2013 11:54:45 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I checked into a nice hotel yesterday. I was very impressed with the room and promptly set about unpacking my things. Because I had been traveling for nearly 20 hours straight, my devices were desperately in need of power.</p>
<div class="container">
<div class="row">
<div class="col-md-6">
<img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/20130303_065721-1480900897184.jpg" class="img-responsive img-thumbnail">
</div>
<div class="col-md-6">
<img src="https://xyzpdq-blog.s3.amazonaws.com/2016/Dec/20130303_070014-1480900917168.jpg" class="img-responsive img-thumbnail">
</div>
</div>
</div>
<p>The lamp next to the bed conveniently had an outlet in the base of the lamp.</p>
<p>I do not have a US spec power supply for my laptop, so I am forced to use an adaptor. Now, granted, the adaptor that I have is rather comically large (my primary one broke and I was forced to find this rather inadequate replacement), but it does serve to illustrate a point.</p>
<p>By simply rotating the outlet 90 degrees, this problem would not exist. Again, I realize this is a somewhat extreme use case, but in the instance where the plug was directly on the power brick, as is the case with many devices, this would still be an issue.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[S3 directory browsing from a custom subdomain]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>This week I was asked to set up a site on our server with directory browsing enabled.  They also wanted to be able to upload files to said site.  Since we are already hosting our servers on AWS, I suggested to them that rather than expending the effort to write</p>]]></description><link>http://184.73.26.214:80/s3-directory-browsing-from-a-custom-subdomain/</link><guid isPermaLink="false">5eeffd511e769b0ce1658f00</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Thu, 28 Feb 2013 20:30:21 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>This week I was asked to set up a site on our server with directory browsing enabled.  They also wanted to be able to upload files to said site.  Since we are already hosting our servers on AWS, I suggested to them that rather than expending the effort to write code to allow them to manage everything via the web, we set up an S3 bucket and allow them to manage the files directly from their desktop.</p>
<p>Somehow I had gotten to this point in time without working with S3.  I am aware of the principle behind it however and I knew it was capable of doing what I had proposed.</p>
<p>Creating a new bucket was easy. Configuring the permissions on the bucket to allow anonymous users was also easy.  Mapping the bucket to a sub-domain and enabling file browsing is where everything started to fall apart on me.</p>
<p>I logged on to AWS and created the bucket.  For the sake of example, let’s say it was called <code>assets.mysite.com</code>.  Something important to note here.  If you are planning on mapping the bucket to a sub-domain like I am in this example, it is typically best practice to make the name of the bucket the same as the sub-domain.</p>
<p>From there I set the permissions to enable everyone to be able to list the contents of the bucket.</p>
<p>Since I knew that my ultimate goal was to map this bucket to a sub-domain on the site, I immediately clicked down to the next tab “Static Website Hosting”, checked “Enable website hosting”, added an index document and clicked “Save”.  I copied the Endpoint that was provided on that tab <code>assets.mysite.com.s3-website-us-east-1.amazonaws.com</code>, jumped over to Route53 and added a CNAME for <code>assets.mysite.com</code> and dropped in the provided Endpoint.</p>
<p>Because I do my research, I knew that to get a list of files to show up, I would need to drop in some javascript to parse the XML content.  I grabbed some <a href="http://aws.amazon.com/code/JavaScript/1713" title="Amazon S3 bucket listing">bucket listing code from the AWS community</a> and uploaded it to the bucket.</p>
<p>Simple. Right?</p>
<p>Wrong.</p>
<p>The listing page loaded, but it didn’t show any of the content that I knew had been uploaded to the bucket.  I start going back through everything I had done, checking for the mistake.  I couldn’t find anything wrong.  I dug through the permissions thinking that perhaps something was off there.  Nothing.  So, I started going through the script and decomposing it to see if perhaps something was not working correctly with the script.  I found the following line where it pulls the list of available content:</p>
<pre><code>http.open('get', location.protocol+'//'+location.hostname);
</code></pre>
<p>Aha!  So, I tried browsing to <code>assets.mysite.com</code> and all I got was a 404 error.  Thinking at this point that perhaps configuring things as a website is what caused the problem, I went back in and disabled the website hosting feature.  I tried again and got a 404 error again.  That was obviously not the issue.</p>
<p>Eventually, after poking around for a while, I stumbled across a different endpoint <code>assets.mysite.com.s3.amazonaws.com</code>.  This returned the XML that I was expecting!  Armed with this new information, I went back and modified the line from the script that was reaching out for the XML to:</p>
<pre><code>http.open('get', 'http://assets.mysite.com.s3.amazonaws.com');
</code></pre>
<p>Surely this was going to solve my problem … or not.</p>
<p>This time I got a 405 error.  I immediately started googling and digging through the documentation and discovered that the issue was a blocked cross-origin request: the page served from <code>assets.mysite.com</code> needs permission to talk to <code>assets.mysite.com.s3.amazonaws.com</code>.  Armed with a little more information, I ran back into the S3 management console and started looking at the options available to me in the permissions tab.  I clicked on the “Edit CORS Configuration” button and edited the rules to include:</p>
<pre><code class="language-markup">&lt;CORSRule&gt;
    &lt;AllowedOrigin&gt;*&lt;/AllowedOrigin&gt; 
    &lt;AllowedMethod&gt;GET&lt;/AllowedMethod&gt; 
&lt;/CORSRule&gt;
</code></pre>
<p>Now when I go to the listing page, I see all of my files!  Mission successful.</p>
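<p>For the curious, the heart of that listing page boils down to pulling the <code>&lt;Key&gt;</code> elements out of the XML the bucket returns.  A stripped-down sketch (the <code>extractKeys</code> function name is mine, not from the AWS sample code):</p>

```javascript
// Pull the object keys out of an S3 ListBucket XML response.
// A real page would first fetch the XML cross-origin, e.g. from
// http://assets.mysite.com.s3.amazonaws.com (which is why the CORS rule matters).
function extractKeys(xml) {
  const keys = [];
  const re = /<Key>([^<]*)<\/Key>/g; // each object in the listing has one <Key>
  let match;
  while ((match = re.exec(xml)) !== null) {
    keys.push(match[1]);
  }
  return keys;
}
```

Once you have the keys, rendering them as links into the bucket is just a loop.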
<p>Hopefully this can help some of you out.  A coworker and I tore (what’s left of) our hair out for a while going through all of the motions on this.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Email Templates and MailTo Syntax]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>The <strong>mailto</strong> prefix in hyperlinks.  Everyone knows about it, everyone uses it.  Not very many people are aware however that it can do more than just pass an email address to your user’s favorite email program.</p>
<p>I was in a situation the other day where I needed to prompt</p>]]></description><link>http://184.73.26.214:80/email-templates-and-mailto-syntax/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef2</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Sun, 20 Jan 2013 09:34:07 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>The <strong>mailto</strong> prefix in hyperlinks.  Everyone knows about it, everyone uses it.  Not very many people are aware however that it can do more than just pass an email address to your user’s favorite email program.</p>
<p>I was in a situation the other day where I needed to prompt a user for specific information, but I did not want / need to go to the effort of writing a contact form.  Yes, I realize writing a contact form would have only taken a few minutes, but sometimes being a little bit lazy can work out well.</p>
<p>So, just to remind everybody and give us a starting place, let’s create a basic mailto link</p>
<pre><code class="language-markup">&lt;a href=&quot;mailto:youremail@domain.com&quot;&gt;Contact Us&lt;/a&gt;
</code></pre>
<p>First, let’s look at adding in more than just a single address.  There are two options here.  We can simply add multiple addresses divided by commas, or we can specify CC and BCC.</p>
<pre><code class="language-markup">&lt;a href=&quot;mailto:youremail@domain.com, [youremail2@domain.com&quot;&gt;Contact](mailto:youremail2@domain.com”&gt;Contact) Us&lt;/a&gt;

&lt;a href=&quot;mailto:youremail@domain.com?cc=yourccemail@domain.com&quot;&gt;Contact Us&lt;/a&gt;

&lt;a href=&quot;mailto:youremail@domain.com?bcc=yourbccemail@domain.com&quot;&gt;Contact Us&lt;/a&gt;
</code></pre>
<p>Now, let’s take a look at passing through a subject.  It is essentially the same principle as the CC and BCC options.</p>
<pre><code class="language-markup">&lt;a href=&quot;mailto:youremail@domain.com?subject=Please write me back&quot;&gt;Contact Us&lt;/a&gt;
</code></pre>
<p>Finally, the body of the email.  This is where my unwillingness to write a contact form comes into play.</p>
<pre><code class="language-markup">&lt;a href=&quot;mailto:youremail@domain.com?body=Sample Body Content&quot;&gt;Contact Us&lt;/a&gt;
</code></pre>
<p>I can hear you now: that’s great, but what if I want more complicated substance to my email body?  What if I need multiple paragraphs?  Well, you can do that, but it is <em>not completely supported</em> in every email client yet.  All of the more recent iterations of the email clients should have no issues however.  The trick to passing paragraph information is URL encoding a line break.  You can accomplish this by placing <code>%0A</code> where you would like a line break.  Extrapolating that, <code>%0A%0A</code> would give us a paragraph break.  Let’s look at an example.</p>
<pre><code class="language-markup">&lt;a href=&quot;mailto:youremail@domain.com?body=Sample Paragraph One%0A%0ASample Paragraph Two&quot;&gt;Contact Us&lt;/a&gt;
</code></pre>
<p>Ok.  We’ve looked at all the options, so, now we can chain all of these together into our Contact Form link.  To chain multiple items together, utilize the &amp; character just like you would in a URL.</p>
<pre><code class="language-markup">&lt;a href=&quot;mailto:youremail@domain.com?cc=yourccemail@domain.com&amp;subject=Please write me back&amp;body==Sample Paragraph One%0A%0ASample Paragraph Two&quot;&gt;Contact Us&lt;/a&gt;
</code></pre>
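<p>If you are generating these links from script rather than typing them by hand, URL-encoding each piece keeps spaces and line breaks intact in every client.  A minimal JavaScript sketch (<code>buildMailto</code> is a hypothetical helper of my own, not a standard API):</p>

```javascript
// Build a mailto: URL from its parts, encoding subject and body so that
// spaces become %20 and line breaks become %0A automatically.
function buildMailto({ to, cc, subject, body }) {
  const params = [];
  if (cc) params.push('cc=' + encodeURIComponent(cc));
  if (subject) params.push('subject=' + encodeURIComponent(subject));
  if (body) params.push('body=' + encodeURIComponent(body));
  return 'mailto:' + to + (params.length ? '?' + params.join('&') : '');
}
```

So a body written as a normal string with <code>\n\n</code> between paragraphs comes out with the <code>%0A%0A</code> encoding shown above, with no hand-encoding required.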
<p>There you have it.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Remote Reboot of Windows 7]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I ran into an interesting problem recently where I was (admittedly) too lazy to walk upstairs and reboot my desktop that I use as a home server.</p>
<p>I connected via remote desktop only to discover that I was not able to reboot the machine from the start menu via a</p>]]></description><link>http://184.73.26.214:80/remote-reboot-of-windows-7/</link><guid isPermaLink="false">5eeffd511e769b0ce1658eee</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Wed, 24 Nov 2010 22:45:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I ran into an interesting problem recently where I was (admittedly) too lazy to walk upstairs and reboot my desktop that I use as a home server.</p>
<p>I connected via remote desktop only to discover that I was not able to reboot the machine from the start menu over RDC.</p>
<p>Curiosity, persistence (and laziness) prevailed and I spent some time researching to figure out how to get around this pesky issue.</p>
<p>Being a lover of most things related to the command prompt, I fired one up in admin mode (since like a good user, my account is not an administrator by default) and tried to reboot using</p>
<p><code>shutdown /r /t 0</code> only to be thwarted with an <code>Access is Denied(5).</code> message.</p>
<p>Researching this told me what was already plainly obvious: it was a permissions issue and I did not have permission to do what I wanted.</p>
<p>Digging a little further, I came across the solution.</p>
<ol>
<li>Open the policy editor (secpol.msc)</li>
<li>Expand Local Policies</li>
<li>Select User Rights Assignment</li>
<li>Find the “Force shutdown from a remote system” policy and add your account.</li>
<li>Once this is done, go to a command prompt and run <code>gpupdate</code></li>
</ol>
<p>Now, when you run <code>shutdown /r /t 0</code>, your remote computer will promptly heed your wishes and restart.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Sorting a .Net list with lambdas]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I love linq.  I would wager that most developers who have done much with it share a similar love.</p>
<p>It makes our lives easier.  It eliminates writing copious amounts of loops, parsing xml, interacting with databases, you name it.  It’s great.</p>
<p>When I run into a situation where linq</p>]]></description><link>http://184.73.26.214:80/sorting-a-net-list-with-lambdas/</link><guid isPermaLink="false">5eeffd511e769b0ce1658eef</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Wed, 07 Apr 2010 22:46:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I love linq.  I would wager that most developers who have done much with it share a similar love.</p>
<p>It makes our lives easier.  It eliminates writing copious amounts of loops, parsing xml, interacting with databases, you name it.  It’s great.</p>
<p>When I run into a situation where linq does not have an obvious tie-in, I start to get a little anxious (pathetic, I realize).</p>
<p>Today I was faced with a situation where I had a list of custom classes that I needed to sort.</p>
<p>This being a List, an in-place OrderBy was nowhere to be found.</p>
<p>So, I returned to my roots and started writing a compare method for my class so that I could use the Sort method that <em>is</em> part of the List generic type.  That is when it struck me that I could probably accomplish this using a lambda.  It turns out that I was correct.  Lambdas were a perfect fit for this scenario.  The code remained clean and concise and I was saved from having to write additional methods just to accomplish a one-off scenario on what was an otherwise complete project.</p>
<p>Lambdas are a progression in C# past anonymous methods.  In the code below, I show how the code would be implemented with a separate function, an anonymous method and finally, with a lambda expression.</p>
<p>First, the test class that we’ll be working with:</p>
<pre><code class="language-csharp">public class MyClass { 
     public int propA { get; set; } 
     public int propB { get; set; } 
     public int propC { get; set; } 
}
</code></pre>
<p>Our test harness:</p>
<pre><code class="language-csharp">public class Tester { 
     public void Main() { 
          List&lt;MyClass&gt; list = new List&lt;MyClass&gt;(); 
          list.Add(new MyClass() { propA = 1, propB = 2, propC = 3 }); 
          list.Add(new MyClass() { propA = 2, propB = 3, propC = 4 }); 
          list.Add(new MyClass() { propA = 3, propB = 4, propC = 5 }); 
          list.Add(new MyClass() { propA = 4, propB = 5, propC = 6 }); 
     }
}
</code></pre>
<p>Now then, we write a custom comparer.  Not a <em>lot</em> of code, but you can imagine how this extrapolates over various properties for large classes.</p>
<pre><code class="language-csharp">public class Tester { 
    private static int CompareByPropA(MyClass a, MyClass b) { 
        return a.propA.CompareTo(b.propA); 
    } 
    public void Main() { 
        List&lt;MyClass&gt; list = new List&lt;MyClass&gt;(); 
        list.Add(new MyClass() { propA = 1, propB = 2, propC = 3 }); 
        list.Add(new MyClass() { propA = 2, propB = 3, propC = 4 }); 
        list.Add(new MyClass() { propA = 3, propB = 4, propC = 5 }); 
        list.Add(new MyClass() { propA = 4, propB = 5, propC = 6 }); 
        //Using the separate CompareByPropA comparison method
        list.Sort(CompareByPropA); 
    } 
}
</code></pre>
<p>This time, we drop the custom comparer and implement the sort using an anonymous method.</p>
<pre><code class="language-csharp">public class Tester { 
    public void Main() { 
        List&lt;MyClass&gt; list = new List&lt;MyClass&gt;(); 
        list.Add(new MyClass() { propA = 1, propB = 2, propC = 3 }); 
        list.Add(new MyClass() { propA = 2, propB = 3, propC = 4 }); 
        list.Add(new MyClass() { propA = 3, propB = 4, propC = 5 }); 
        list.Add(new MyClass() { propA = 4, propB = 5, propC = 6 }); 
        //Using an anonymous method 
        list.Sort(delegate(MyClass a, MyClass b) { 
             return a.propA.CompareTo(b.propA); }); 
    } 
}
</code></pre>
<p>Finally, we rewrite the anonymous method into a Lambda.  It is still easy to read and much more concise.</p>
<pre><code class="language-csharp">public class Tester { 
    public void Main() { 
        List&lt;MyClass&gt; list = new List&lt;MyClass&gt;(); 
        list.Add(new MyClass() { propA = 1, propB = 2, propC = 3 }); 
        list.Add(new MyClass() { propA = 2, propB = 3, propC = 4 }); 
        list.Add(new MyClass() { propA = 3, propB = 4, propC = 5 }); 
        list.Add(new MyClass() { propA = 4, propB = 5, propC = 6 }); 
        //Using a Lambda 
        list.Sort((a, b) =&gt; a.propA.CompareTo(b.propA)); 
    } 
}
</code></pre>
<p>So, next time you go to write custom compare code, step back and see if you can do it a little more simply.  If you are going to be using your comparison code in more than one place, then perhaps it is still a better idea to implement it using that route vs the lambda.  If it is a one-off thing, then a lambda and some linq can be a beautiful thing!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Review of Flex 4 in Action]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>As someone who has relatively little experience with Flex, I found this book to be a great asset in furthering my understanding of how and why things work the way they do not only in MXML, but also on the ActionScript side.</p>
<p>The tone of the book is very friendly.</p>]]></description><link>http://184.73.26.214:80/review-of-flex-4-in-action/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef0</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Sun, 21 Mar 2010 22:47:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>As someone who has relatively little experience with Flex, I found this book to be a great asset in furthering my understanding of how and why things work the way they do not only in MXML, but also on the ActionScript side.</p>
<p>The tone of the book is very friendly.  As a general rule, I did not find myself bogged down in technical terminology.  A good balance was struck between keeping things simple for the average reader, yet still technical enough to hold the attention of somebody with more in depth knowledge of previous versions of Flex.</p>
<p>The book also starts out with a nice section on the benefits of Flex and RIA’s in general along with a few pointers on how to sell upper management on its use.  From there they move on to cover the basics of ActionScript and Flex and show how MXML layout works, how MXML and AS work with one another, some basics on how to work with data inside of your application and also detail the differences between the old Halo controls and the new Spark controls.</p>
<p>The second section of the book delves into more complex topics like the event model, view states, writing custom components and more.  There is a small jump in the assumed capabilities of the reading audience at this point.  Readers who are new to Flex and who have worked through the first part of the book should still be able to follow along at this point.</p>
<p>The only negative that struck me as I was reading the book was the code samples.  While they do an excellent job of demonstrating the topic that they are associated with, I would have preferred if by the time I reached the end of the book I had a fully working reference application vs a series of smaller individual apps.</p>
<p>Overall, I enjoyed reading the book and learned quite a lot, not only about Flex 4, but Flex as a whole.  The authors did a good job of keeping the subject matter entertaining and making their various contributions flow seamlessly together.</p>
<p>I would recommend this book both to people who are starting out in Flex development as well as people who are already familiar with Flex and looking to find out what is new in Flex 4.</p>
<p>You can pre-order your copy on <a href="http://web.archive.org/web/20120208145212/http://www.xyzpdq.org/ct.ashx?id=8489c6c4-27ae-4ced-b582-94632ac92474&amp;url=http%3a%2f%2fwww.amazon.com%2fFlex-4-Action-Dan-Orlando%2fdp%2f1935182420">Amazon</a></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[OData = Hotness]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Microsoft has officially announced <a href="http://www.odata.org">OData</a>.  If you are not aware of what this is, then in a sentence: OData is a queryable REST based interface that exposes your data via AtomPub.</p>
<p>To publish a feed, you have to use .Net.  However, they have provided client sdk’s for a variety</p>]]></description><link>http://184.73.26.214:80/odata-hotness/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef7</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Wed, 17 Mar 2010 22:48:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Microsoft has officially announced <a href="http://www.odata.org">OData</a>.  If you are not aware of what this is, then in a sentence: OData is a queryable REST based interface that exposes your data via AtomPub.</p>
<p>To publish a feed, you have to use .Net.  However, they have provided client SDKs for a variety of languages to allow for simple querying of the exposed services and they are working on several more.</p>
<p>I strongly encourage you to check it out.</p>
<p>Partly to give an idea of what is possible, and partly for my own reference, I am going to repost a “cheat sheet” that I found online at <a href="http://blogs.msdn.com/alexj/default.aspx">Meta-Me</a></p>
<h6 id="theservice">The Service:</h6>
<p>It all starts with a Data Service hosted somewhere:</p>
<pre><code>http://server/service.svc
</code></pre>
<h6 id="basicqueries">Basic queries:</h6>
<p>You access the Data Service entities through resource sets, like this:</p>
<pre><code>http://server/service.svc/People
</code></pre>
<p>You request a specific entity using its key like this:</p>
<pre><code>http://server/service.svc/People(16)
</code></pre>
<p>Or by using a reference relationship to something else you know:</p>
<pre><code>http://server/service.svc/People(16)/Mother
</code></pre>
<p>This asks for person 16’s mother.</p>
<p>Once you have identified an entity you can refer to its properties directly:</p>
<pre><code>http://server/service.svc/People(16)/Mother/Firstname
</code></pre>
<h6 id="value">$value:</h6>
<p>But the last query wraps the property value in XML; if you want just the raw property value, you append <code>$value</code> to the url like this:</p>
<pre><code>http://server/service.svc/People(16)/Mother/Firstname/$value
</code></pre>
<h6 id="filter">$filter:</h6>
<p>You can filter resource sets using <code>$filter</code>:</p>
<pre><code>http://server/service.svc/People?$filter=Firstname eq 'Fred'
</code></pre>
<p>Notice that strings in the filter are single quoted.</p>
<p>Numbers need no quotes though:</p>
<pre><code>http://server/service.svc/Posts?$filter=AuthorId eq 1
</code></pre>
<p>To filter by date you have to identify the date in the filter, like this:</p>
<pre><code>http://server/service.svc/Posts?$filter=CreatedDate eq DateTime'2009-10-31'
</code></pre>
<p>You can filter via reference relationships:</p>
<pre><code>http://server/service.svc/People?$filter=Mother/Firstname eq 'Wendy'
</code></pre>
<p>The basic operators you can use in a filter are:</p>
<table class="table table-striped"><thead><tr><th>Operator</th><th>Description</th><th>C# equivalent</th></tr></thead><tbody><tr><td>eq</td><td><strong>eq</strong>uals</td><td>==</td></tr><tr><td>ne</td><td><strong>n</strong>ot <strong>e</strong>qual</td><td>!=</td></tr><tr><td>gt</td><td><strong>g</strong>reater <strong>t</strong>han</td><td>&gt;</td></tr><tr><td>ge</td><td><strong>g</strong>reater than or <strong>e</strong>qual</td><td>&gt;=</td></tr><tr><td>lt</td><td><strong>l</strong>ess <strong>t</strong>han</td><td>&lt;</td></tr><tr><td>le</td><td><strong>l</strong>ess than or <strong>e</strong>qual</td><td>&lt;=</td></tr><tr><td>and</td><td>and</td><td>&amp;&amp;</td></tr><tr><td>or</td><td>or</td><td>||</td></tr><tr><td>()</td><td>grouping</td><td>()</td></tr></tbody></table>
<p>There are also a series of functions that you can use in your filters if needed.</p>
<h6 id="expand">$expand:</h6>
<p>If you want to include related items in the results you use <code>$expand</code> like this:</p>
<pre><code>http://server/service.svc/Blogs?$expand=Posts
</code></pre>
<p>This returns the matching Blogs and each Blog’s posts.</p>
<h6 id="select">$select:</h6>
<p>Some Data Services allow you to limit the results to just the properties you require – aka projection.  For example, if you just want the Id and Title of matching Posts you would need something like this:</p>
<pre><code>http://server/service.svc/Posts?$select=Id,Title
</code></pre>
<p>You can even project properties of related objects too, like this:</p>
<pre><code>http://server/service.svc/Posts?$expand=Blog&amp;$select=Id,Title,Blog/Name
</code></pre>
<p>This projects just the Id, Title and the Name of the Blog for each Post.</p>
<h6 id="count">$count:</h6>
<p>If you just want to know how many records would be returned, without retrieving them, you need <code>$count</code>:</p>
<pre><code>http://server/service.svc/Blogs/$count
</code></pre>
<p>Notice that <code>$count</code> becomes one of the segments of the URL – it is not part of the query string – so if you want to combine it with another operation like <code>$filter</code> you have to specify <code>$count</code> first, like this:</p>
<pre><code>http://server/service.svc/Posts/$count?$filter=AuthorId eq 6
</code></pre>
<p>This query returns the number of posts authored by person 6.</p>
<h6 id="orderby">$orderby:</h6>
<p>If you need your results ordered you can use <code>$orderby</code>:</p>
<pre><code>http://server/service.svc/Blogs?$orderby=Name
</code></pre>
<p>This returns the results in ascending order; to get descending order you need:</p>
<pre><code>http://server/service.svc/Blogs?$orderby=Name%20desc
</code></pre>
<p>To sort first by one property and then by another you need:</p>
<pre><code>http://server/service.svc/People?$orderby=Surname,Firstname
</code></pre>
<p>Which you can combine with <strong>desc</strong> if necessary.</p>
<h6 id="top">$top:</h6>
<p>If you want just the first 10 items you use <code>$top</code> like this:</p>
<pre><code>http://server/service.svc/People?$top=10
</code></pre>
<h6 id="skip">$skip:</h6>
<p>If you are only interested in a certain page of data, you need <code>$top</code> and <code>$skip</code> together:</p>
<pre><code>http://server/service.svc/People?$top=10&amp;$skip=20
</code></pre>
<p>This tells the Data Service to skip the first 20 matches and return the next 10. Useful if you need to display the 3rd page of results when there are 10 items per page.</p>
<p><strong>Note:</strong> It is often a good idea to combine <code>$top</code> &amp; <code>$skip</code> with <code>$orderby</code> too, to guarantee that the order in which results are retrieved from the underlying data source is consistent.</p>
<h6 id="inlinecountskiptoken">$inlinecount &amp; $skiptoken:</h6>
<p>Using <code>$top</code> and <code>$skip</code> allows the client to control paging.</p>
<p>But the server also needs a way to control paging – to minimize the workload needed to service both naive and malicious clients – and the OData protocol supports this via <a href="http://blogs.msdn.com/astoriateam/archive/2009/03/19/ado-net-data-services-v1-5-ctp1-server-driven-paging.aspx">Server Driven Paging</a>.</p>
<p>With Server Driven Paging turned on the client might ask for every record, but they will only be given one page of results.</p>
<p>This, as you can imagine, can make life a little tricky for client application developers.</p>
<p>If the client needs to know how many results there really are, they can append the <code>$inlinecount</code> option to the query, like this:</p>
<pre><code>http://server/service.svc/People?$inlinecount=allpages
</code></pre>
<p>The results will include a total count ‘inline’, and a url generated by the server to get the next page of results.<br>
This generated url includes a $skiptoken, that is the equivalent of a cursor or bookmark, that instructs the server where to resume:</p>
<pre><code>http://server/service.svc/People?$skiptoken=4
</code></pre>
<h6 id="links">$links</h6>
<p>Sometimes you just need to get the urls for entities related to a particular entity, which is where <code>$links</code> comes in:</p>
<pre><code>http://server/service.svc/Blogs(1)/$links/Posts
</code></pre>
<p>This tells the Data Service to return links – aka urls – for all the Posts related to Blog 1.</p>
<h6 id="metadata">$metadata</h6>
<p>If you need to know what model an OData compliant Data Service exposes, you can do this by going to the root of the service and appending <code>$metadata</code> like this:</p>
<pre><code>http://server/service.svc/$metadata
</code></pre>
<p>This should return an <a href="http://download.microsoft.com/download/B/0/B/B0B199DB-41E6-400F-90CD-C350D0C14A53/%5BMC-EDMX%5D.pdf">EDMX</a> file containing the conceptual model (aka EDM) exposed by the Data Service.</p>
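<p>One practical note on all of the query options above: spaces and other special characters in filter expressions need to be URL-encoded before the request goes out.  A minimal sketch of composing a query URL from script – <code>buildODataUrl</code> is my own hypothetical helper, not part of any of the client SDKs:</p>

```javascript
// Compose an OData query URL from a resource URL and an options object,
// prefixing each option with '$' and URL-encoding its value.
function buildODataUrl(resource, options) {
  const parts = Object.keys(options).map(
    key => '$' + key + '=' + encodeURIComponent(options[key])
  );
  return resource + (parts.length ? '?' + parts.join('&') : '');
}
```

For example, <code>buildODataUrl('http://server/service.svc/Posts', { filter: "AuthorId eq 1", top: 10 })</code> yields the <code>$filter</code>/<code>$top</code> query shown earlier, with the spaces encoded as <code>%20</code>.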
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[ColdFusion Troubles]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I have been running awry of ColdFusion.</p>
<p>I am running Windows XP (not by choice), and <em>had</em> a local installation of ColdFusion 8 Server.  The instance of ColdFusion was stopped since it is a bit of a memory hog and I was not actively using it.  I have been working</p>]]></description><link>http://184.73.26.214:80/coldfusion-troubles/</link><guid isPermaLink="false">5eeffd511e769b0ce1658eea</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Fri, 26 Feb 2010 22:49:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I have been running awry of ColdFusion.</p>
<p>I am running Windows XP (not by choice), and <em>had</em> a local installation of ColdFusion 8 Server.  The instance of ColdFusion was stopped since it is a bit of a memory hog and I was not actively using it.  I have been working on getting an instance of Telligent Community Server up and running for evaluation purposes.  I was baffled by the fact that the Telligent demo was taking around 10 minutes on average to load a page from the local system.</p>
<p>Needless to say, since everything else seemed to be running fine, I was blaming Telligent’s software.  Then I started exploring a little more and figured out that IIS was taking around 30 seconds just to serve up an image.  Something was definitely up, but despite all of my best troubleshooting skills, I could not find the source of the problem.</p>
<p>I did what I always do when something goes wrong that I don’t understand, I turned to Google.</p>
<p>I tried about 20 different “solutions” until I stumbled across a forum post saying to start ColdFusion if it was installed.  <em>poof</em>. The page that took 10 minutes to load before now takes 5 seconds.  It turns out that the CF ISAPI plugin is constantly trying to talk to the server.  If it can’t find the server, it doesn’t just die, it keeps trying. … on every. single. request.</p>
<p><strike>strike one.</strike></p>
<p>A short while later, I was working on trying to speed up a dashboard application that a coworker and I had written.  It is pretty simple: CF queries the database and retrieves somewhere in the range of 10-700 rows of data, turns them into objects, and passes those objects off to Flex, which then graphs them.  It was performing fine as long as there were fewer than 40 rows.  By the time you got up to a few hundred rows of data, it would take an obscene amount of time to load.</p>
<p>We scratched our heads for the longest time and were pretty convinced that the Flex chart control was to blame.</p>
<p>Then we had an idea.  Instead of letting CF instantiate each row into an object, just send CF the results as an XML document from the database, let CF hand the XML off to Flex, and then proceed from there.</p>
<p>Same story. 10 minutes now became 2 seconds.</p>
<p>It seems that ColdFusion doesn’t do so well at creating objects.</p>
<p><strike>strike two.</strike></p>
<p>I’m not very pleased with ColdFusion at the moment.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Response.Redirect inside of a Try/Catch]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Ok, first off, let me state, that I am not entirely certain WHY I did this in the first place.  I am going to say I was rushed and wasn’t thinking clearly.  However, I glazed over it and moved on and when it came time to test, I was</p>]]></description><link>http://184.73.26.214:80/response-redirect-inside-of-a-trycatch/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef1</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Fri, 27 Mar 2009 22:49:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Ok, first off, let me state, that I am not entirely certain WHY I did this in the first place.  I am going to say I was rushed and wasn’t thinking clearly.  However, I glazed over it and moved on and when it came time to test, I was getting bizarre behavior that didn’t throw an error.</p>
<pre><code class="language-csharp">Session["var"] = ""; 
try { 
    Session["var"] = "Good Value"; 
    Response.Redirect("newpage.html"); 
} 
catch(Exception ex) 
{ 
    Session["var"] = "Bad Value"; 
}
</code></pre>
<p>In the code above, <code>Session["var"]</code> will ALWAYS equal “Bad Value”.  Why, you may ask?</p>
<p><code>Response.Redirect</code> throws a <code>ThreadAbortException</code>.  … Fun, no?</p>
<p>If you are building a URL inside of a try block that you want to then redirect to, declare a string, build your url, and then pass that on to the Response.Redirect statement.</p>
<p>For Example:</p>
<pre><code class="language-csharp">string _url; 
try {
    _url = "yourpage.aspx?var=" + iffyMethodCall(); 
} 
catch(InvalidOperationException ex) 
{
    _url = "error.html"; 
} 
Response.Redirect(_url);
</code></pre>
<p>So, if you ever run into odd behavior in a site you are working on, and when debugging your code goes straight past your Response.Redirect and into the catch block while the debugger gives you cryptic messages, this may be what you’re seeing.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[IIS Worker Process fail 503 Error]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Helping a friend to configure his new Win2k8 Server with IIS7 this weekend we ran into an issue where IIS kept returning 503 errors.<br>
Examining the Application Event log I saw that  IISW3SVC-WP was quitting due to application errors. “The error is the data”.  How helpful.<br>
After a considerable amount</p>]]></description><link>http://184.73.26.214:80/iis-worker-process-fail-503-error/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef5</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Sun, 07 Dec 2008 22:50:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Helping a friend to configure his new Win2k8 Server with IIS7 this weekend we ran into an issue where IIS kept returning 503 errors.<br>
Examining the Application Event log I saw that IISW3SVC-WP was quitting due to application errors. “The error is the data”.  How helpful.<br>
After a considerable amount of digging I discovered that in the Microsoft.Net/Framework/ folder, there was a beta version of v2 of the framework.  The only version that <em>should</em> be there is v2.0.50727.<br>
I deleted the beta version of the framework and restarted the IIS worker processes that were causing problems and everything immediately burst to life!<br>
I’ve no clue what installed the beta version of the framework, but it is a definite lesson to make sure that when distributing a framework to be 100% certain that you are always including the latest “release” version… oh yeah, and perhaps doing a check for an existing version of said framework before installing.<br>
At any rate, hope this helps somebody.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Useless entry #268]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>While in a chat today, the following conversation occurred…</p>
<p>After I stopped laughing I had to post it.</p>
<p>A : I had a mouse in my well the other day. I shop-vac-d it out.</p>
<p>B : lol, I can hear it now… “whrrrrrrrrr, ssshhTHUNK”</p>
<p>C : <em>phoomp</em></p>
<p>D : and forever after the other</p>]]></description><link>http://184.73.26.214:80/useless-entry-268/</link><guid isPermaLink="false">5eeffd511e769b0ce1658eff</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Thu, 04 Dec 2008 22:51:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>While in a chat today, the following conversation occurred…</p>
<p>After I stopped laughing, I had to post it.</p>
<p>A : I had a mouse in my well the other day. I shop-vac-d it out.</p>
<p>B : lol, I can hear it now… “whrrrrrrrrr, ssshhTHUNK”</p>
<p>C : <em>phoomp</em></p>
<p>D : and forever after the other mice tell tales of abduction from above</p>
<p>D : “seriously, it was like some kind of tractor beam!”</p>
<p>B : But is derided as a crazy mouse.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Whedon Quote]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I am not sure how many of you are <a href="http://web.archive.org/web/20120208145212/http://www.xyzpdq.org/ct.ashx?id=e31abdeb-4f9e-44da-88c3-d4e8db150028&amp;url=http%3a%2f%2fwhedonesque.com%2f">Whedon</a> fans like myself.  I came across this today and I’m posting it in part because it is, in my opinion, phenomenal, and also so that I have an easy place to look it up in the future.</p>
<blockquote>
<p>Passion, it</p></blockquote>]]></description><link>http://184.73.26.214:80/whedon-quote/</link><guid isPermaLink="false">5eeffd511e769b0ce1658ef9</guid><dc:creator><![CDATA[Cody Beckner]]></dc:creator><pubDate>Mon, 08 Sep 2008 22:51:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I am not sure how many of you are <a href="http://web.archive.org/web/20120208145212/http://www.xyzpdq.org/ct.ashx?id=e31abdeb-4f9e-44da-88c3-d4e8db150028&amp;url=http%3a%2f%2fwhedonesque.com%2f">Whedon</a> fans like myself.  I came across this today and I’m posting it in part because it is, in my opinion, phenomenal, and also so that I have an easy place to look it up in the future.</p>
<blockquote>
<p>Passion, it lies in all of us, sleeping… waiting… and though unwanted… unbidden… it will stir… open its jaws and howl. It speaks to us… guides us… passion rules us all, and we obey. What other choice do we have? Passion is the source of our finest moments. The joy of love… the clarity of hatred… and the ecstasy of grief. It hurts sometimes more than we can bear. If we could live without passion maybe we’d know some kind of peace… but we would be hollow… Empty rooms shuttered and dank. Without passion we’d be truly dead.</p>
</blockquote>
<blockquote>
<p>– Joss Whedon</p>
</blockquote>
<!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>