Michael Friis' Blog

EC2 SQL Server backup strategies and tactics

The many backup modes offered by Microsoft SQL Server, combined with the prodigious hardware options available on Amazon EC2, can make choosing a backup strategy for your setup a little confusing. In this post, I’ll sketch some options and end with a simple PowerShell script, usable on both Express and Standard editions, that will back up your database to S3.

To start with, you should probably be running your database off an EBS (Elastic Block Store) volume. EBS volumes can sustain many more random IOPS than instance disks (good for typical database workloads) and they live independently of your instances. While I haven’t had an instance die from under me, if one should cop it, all data on the local disks will be gone-gone.

EBS volumes can fail too, however, and will do so at an annualised rate of 0.1% to 0.5% according to Amazon. You may decide this is good enough for your purposes and leave it at that. Bear in mind, however, that this failure rate is compounded by other risks, such as Windows or SQL Server malfunctioning and corrupting the volume, you pressing the wrong button in the AWS Console or Management Studio, a disgruntled employee doing it on purpose, or something else entirely. In other words, you should take backups.

A simple approach is to use the snapshotting feature of EBS. This basically saves the (diff of the) contents of your volume to S3, from whence it can be restored to life if something happens to the volume. I’ve used this to muck around with test environments and such. It works fine and could conceivably be automated using the AWS API. It’s a rather low-level approach though, and you could easily find yourself restoring from a snapshot taken with SQL Server’s pants around its ankles, in the middle of a transaction. While SQL Server is obviously capable of recovering from such an indiscretion and rolling back to a safe state, this can be something of a hassle.
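To give an idea of what such automation might look like, here is a minimal sketch using the AWS SDK for .NET; this is my assumption of a typical setup, not something from the original post, and the volume id is hypothetical.

// Sketch: snapshot an EBS volume programmatically (assumes the AWS SDK for .NET,
// with credentials and region picked up from the environment)
using Amazon.EC2;
using Amazon.EC2.Model;

var ec2 = new AmazonEC2Client();
var snapshot = ec2.CreateSnapshot(new CreateSnapshotRequest
{
    VolumeId = "vol-0123456789abcdef0",   // hypothetical volume id
    Description = "Pre-backup snapshot of the SQL Server data volume"
});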

Another option is to do normal backups to another EBS volume mounted on the same instance. While I have no knowledge of Amazon datacenter topologies, one could fear that different EBS volumes attached to the same instance end up being hosted on the same EBS-SAN-thingamebob, the death of which would then also be your undoing.

You could also copy backup-files to another instance mounting its own EBS volume, or set up replication — allowing you to recover very quickly. Note that SQL Server Express can subscribe to a Standard instance in a replication setup, although it cannot publish. Your replicated instance could even live in a different availability zone, although you would then incur bandwidth cost on exchanged data on top of the cost of running an extra instance.

The approach we ended up taking, however, uses S3. Amazon promises that S3 is very safe (“no single point of failure”), and it has the added benefit of being available independently of EC2 instances. To do a backup, we basically do a full database backup to one of the local disks and then move the file to S3. This is handled by a PowerShell script invoked as a scheduled task, making it usable on SQL Server Express instances (where the native SQL Server backup scheduling is not otherwise available). To handle the S3 interaction, we use the free CloudBerry snap-in. A few gotchas:

  1. If you’re running on an x64 system, install the snap-in with that .NET version
  2. You probably have to modify the PowerShell script execution policy on your instance (see the sketch below)
  3. You need the DotNetZip lib for zipping
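For reference, the two setup steps might look something like this; this is a sketch of a typical setup rather than something from the original post, and the task name is arbitrary:

# Run once from an elevated PowerShell prompt: allow local scripts to execute
Set-ExecutionPolicy RemoteSigned

# Register the script as a daily scheduled task; the command line is the one
# from the script header below
schtasks /create /tn "SqlBackupToS3" /sc DAILY /st 03:00 /tr "powershell.exe &'C:\Path\dbbackup.ps1'"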

Some possible improvements are zipping the files and shrinking the log file before upload (*both added February 1, 2009*), and perhaps an incremental backup scheme.

The script is included below.

# This Powershell script is used to backup a SQL Server database and move the backup file to S3
# It can be run as a scheduled task like this:
# C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe &'C:\Path\dbbackup.ps1'
# Written by Michael Friis (http://friism.com)

$key = "yourkey"
$secret = "yoursecret"
$localfolder = "C:\path\tobackupfolder"
$s3folder = "somebucket/backup/"
$name = Get-Date -uformat "backup_%Y_%m_%d"
$filename = $name + ".bak"
$zipfilename = $name + ".zip"
$dbname = "yourdb"
$dblogname = "yourdb_log"
$ziplibloc = "C:\pathto\ziplib\Ionic.Utils.Zip.dll"

# Remove existing db backup file
if(Test-Path -path ($localfolder + "\" + $filename)) { Remove-Item ($localfolder + "\" + $filename) }

$query =
"
USE {2}
GO

DBCC SHRINKFILE({3})

GO

BACKUP DATABASE [{2}] TO  DISK = N'{0}\{1}'
        WITH NOFORMAT, NOINIT,  NAME = N'backup', SKIP, REWIND, NOUNLOAD,  STATS = 10
GO
declare @backupSetId as int
select @backupSetId = position from msdb..backupset
where database_name=N'{2}' and backup_set_id=(select max(backup_set_id)
from msdb..backupset where database_name=N'{2}' )

if @backupSetId is null
begin
        raiserror(N'Verify failed. Backup information for database ''{2}'' not found.', 16, 1)
end
RESTORE VERIFYONLY FROM  DISK = N'{0}\{1}'
        WITH  FILE = @backupSetId,  NOUNLOAD,  NOREWIND" -f $localfolder, $filename, $dbname, $dblogname

sqlcmd -Q $query -S "."

# Remove existing zip file
if(Test-Path -path ($localfolder + "\" + $zipfilename)) { Remove-Item ($localfolder + "\" + $zipfilename) }

#Zip the backup file
[System.Reflection.Assembly]::LoadFrom($ziplibloc);
$zipfile =  new-object Ionic.Utils.Zip.ZipFile($localfolder + "\" + $zipfilename);
$e= $zipfile.AddFile($localfolder + "\" + $filename)
$zipfile.Save()

#Upload to S3
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path $s3folder
$src = Get-CloudFilesystemConnection | Select-CloudFolder $localfolder
$src | Copy-CloudItem $destination –filter $zipfilename

Querying relationships with LinqtoCRM

I’ve just recorded a web cast demonstrating joins with LinqtoCRM. The pièce de résistance is a join across a many-to-many relationship, with the intermediary class generated by CRMMetal:

var res = from u in p.Linq<systemuser>()
          join sr in p.Linq<systemuserroles>() on u.systemuserid.Value equals sr.systemuserid.Value
          join r in p.Linq<role>() on sr.roleid.Value equals r.roleid.Value
          select new { u.fullname, r.name };

The equivalent example query in the CRM SDK is around forty lines, compared to four for LinqtoCRM. Watch the web cast here.

Installing IIS 6 SMTP service on an Amazon EC2 instance

The standard Amazon Windows AMIs don’t come with the IIS 6 SMTP component installed. It can be added through the “Add or Remove Programs”->”Add/Remove Windows Components” util on Windows Server 2003 (full guide), but you need the installation media. Amazon has an article describing how to do just that here. You basically create an EBS volume from a snapshot they provide (2GB minimum size) and then attach it to your instance. It will show up as a drive in Windows, holding the contents of the two installation disks. Just point the installer at those and you should be good. Afterwards you can detach and delete the EBS volume.

Techniques for unique, correct and fast geo-queries II

You may remember that my last post on this topic ended with a question. Just to recap: we have a table with a lot of rows that have geographical coordinates, and we want to find a random subset that lies within a given map window. The problem with the query demonstrated last time was that some rows have the exact same coordinates, and for those coordinates the query would always return the same row (namely the one with the highest intid).

I posted a question on Stackoverflow and Tom H. came up with a solution that was 90% there. The full query looks like this:

with cte as
(
    select
        intid,
        row_number() over
            (
                    partition by geoLat, geoLng
                    order by newid()
            ) as row_num,
        count(intid) over (partition by geoLat, geoLng) as TotalCount
    from
        documents d
	where
		d.geoLat < @maxLat
			and d.geoLat > @minLat
			and
			(
				((@maxLng > @minLng) and
					(d.geoLng < @maxLng and d.geoLng > @minLng))
				or
				((@maxLng < @minLng) and
					((d.geoLng > @minLng and d.geoLng < 180)
						or (d.geoLng > -180 and d.geoLng < @maxLng))
					)
			)
)
select top (@maxcount)
    dd.*, cte.intid, rand(cte.intid)
from
    cte,documents dd
where
    row_num = 1 + floor(rand() * TotalCount) and cte.intid = dd.intid
order by newid()

The query uses Common Table Expressions, which I’ve dabbled in before. Looking at the execution plan makes my head hurt, but it’s at least as fast as the original version. See the new version in action at the TEDBot site.
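In case it’s useful, here is a minimal sketch (not from the original post) of supplying the query’s parameters from C#; the connection string, the bounding box values and the geoQuery variable are all hypothetical.

using System.Data.SqlClient;

// geoQuery is assumed to hold the CTE query above
using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=mydb;Integrated Security=True"))
using (var cmd = new SqlCommand(geoQuery, conn))
{
    cmd.Parameters.AddWithValue("@minLat", 54.5);
    cmd.Parameters.AddWithValue("@maxLat", 57.8);
    cmd.Parameters.AddWithValue("@minLng", 8.0);
    cmd.Parameters.AddWithValue("@maxLng", 12.7);
    cmd.Parameters.AddWithValue("@maxcount", 100);

    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // plot the row, e.g. reader["geoLat"], reader["geoLng"]
        }
    }
}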

LinqtoCRM 0.3.0 released

There’s a new version of LinqtoCRM out; get it here. Petteri Räty has rewritten the query-generation engine so that arbitrary selectors are supported. He has also squashed some nasty predicate bugs and added more unit tests. I’ve implemented a CRMMetal tool that generates classes for many-to-many relationships, and joins across these are now supported.

I’ve also recorded a new screencast for your amusement.

Downloading the EU

… or parts of it anyway.

The European Union is generally up to lots of weird and wonderful things, one of the more esoteric being the “Tenders Electronic Daily” (TED) database. Basically, all public procurement above a certain value has to go through this database so that companies from all over the world have a fair chance of winning the contracts. Once a tender is won, the names and addresses of the winning companies are also posted. This, in the hope that Europeans will get more roads for their tax euros and that less money disappears into the mayor’s cousin’s pocket.

The database is hosted at http://ted.europa.eu/. It’s publicly available, but you can’t obtain the data in bulk unless you pay a lot of money. This post describes how data can be scraped en masse through the web interface. I’ve written it partly to brag, partly to help you out on the off chance that you someday need to analyse these contracts (in which case I sincerely hope you stumble on this post).

A typical won tender looks like this (you need to be logged in, registration is free):
http://ted.europa.eu/Exec?DataFlow=N_list_results.dfl&Template=TED/N_result_details_data.htm&Page=1&docnumber=2005238380&StatLang=EN

http://ted.europa.eu/Exec?DataFlow=N_list_results.dfl&Template=TED/N_result_details_curr.htm&Page=1&docnumber=2005238380&StatLang=EN

Notice that there are two interesting pages per contract: one has structured data, the other has unstructured text (the “Document family” stuff is also pretty interesting, but we won’t go into that here). Also note the “docnumber” URL parameter. It actually turns out to be composed of two parts: the year (2005 for this particular contract) and an integer (238380). The integer part starts in January at 000001 and is then incremented as contracts accrue through the year. There are very few holes in this sequence, although it seems a few documents are redacted each year. The problem now looks as simple as determining the maximum docnumber for each year and wading through all of them, downloading both data and text parts, right?
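For illustration, here is a minimal sketch (not from the original post) of enumerating candidate docnumbers for one year, assuming the six-digit, zero-padded counter described above; the upper bound is whatever the year’s highest observed counter turns out to be.

using System.Collections.Generic;

// Sketch: build candidate docnumbers, e.g. 2005 + 238380 -> "2005238380"
static IEnumerable<string> DocNumbers(int year, int maxCounter)
{
    for (int i = 1; i <= maxCounter; i++)
    {
        yield return year + i.ToString("D6");
    }
}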

Unfortunately, the login requirement will foil you: to get at a contract more than a few weeks old, you need to be logged in. Being considered “logged in” by TED turns out to entail posting a valid JSESSIONID cookie. The browser obviously accomplishes this automatically when you interact with the web site, but replicating it in a scraper proved to be extraordinarily difficult; in fact, I spent the better part of a week looking at WireShark traces before I managed to reverse engineer the process. There are three steps:

  1. Post username and password to http://ted.europa.eu/eLogin
  2. Simulate a search by posting to http://ted.europa.eu/ExecSessionVariables
  3. Simulate browsing the search result by getting http://ted.europa.eu/Exec?DataFlow=ShowPage.dfl&TableName=TED_EN&Template=TED/N_result_list.htm&Page=1&toconf=yes

Any sane login process would obviously stop after the first step, but for some reason you have to build up state for your session on the server before you are allowed to retrieve random documents. The exact nature of the search being simulated is completely irrelevant; I just use a giant string mined from a random WireShark trace. The web site is built on some arcane Java stack, the convolutedness of which must be pretty amazing (posting incomplete requests will net you a stack trace that also reveals the supplier, who shall go unnamed here). Below, I’ve posted a C# class that will do the right thing. Note the gotcha that HttpWebRequest will not absorb cookies received during POSTs, so you have to record these yourself.

Once you can get valid session ids, getting all the data is pretty straightforward, although it will probably take a while. I highly recommend requesting gzipped data from the EU server (by including an “Accept-Encoding: gzip” header) as it will drastically cut down on bandwidth usage. You can safely have multiple scrapers hammering the website concurrently; I found 10 to be a good number. Note that the servers are apparently rebooted around midnight CET each day, during which time requests tend to fail.
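One way to get the gzip behaviour with HttpWebRequest, as an alternative to adding the header by hand, is the two-line sketch below; it is not part of the Authenticator class further down, and url stands in for whatever document you are fetching.

// Sketch: ask for gzip and have the response decompressed transparently
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
req.AutomaticDecompression = DecompressionMethods.GZip;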

Remember that you can browse the contract information on Google Maps here: http://tedbot.itu.dk/

And the code:

using System;
using System.Text;
using System.Net;
using System.IO;
using System.Net.Security;
using System.Security.Principal;

namespace TED.Scraper
{
    class Authenticator
    {
        static string userid = "username";
        static string password = "password";

        public enum MethodType { Get, Post };

        static string extSearchq = @"Name=statisticMode%40fulltext_textfield"+
            "%40OJ_textfield%40country_textfield"+
            "%40place_textfield%40contract_textfield"+
            "%40procedure_textfield%40document_textfield"+
            "%40regulation_textfield%40CPV_textfield%40NUTS_textfield"+
            "%40publication_textfield%40docnumber_textfield%40datedoc_textfield"+
            "%40deadline_textfield%40type_author_textfield%40name_author_textfield"+
            "%40heading_textfield%40activity_textfield%40fulltext_textfield_hid"+
            "%40OJ_textfield_hid%40country_textfield_hid%40place_textfield_hid"+
            "%40contract_textfield_hid%40procedure_textfield_hid%40document_textfield_hid"+
            "%40regulation_textfield_hid%40CPV_textfield_hid%40NUTS_textfield_hid"+
            "%40publication_textfield_hid%40docnumber_textfield_hid%40datedoc_textfield_hid"+
            "%40deadline_textfield_hid%40type_author_textfield_hid%40name_author_textfield_hid"+
            "%40heading_textfield_hid%40activity_textfield_hid%40docLang%40maxRow%40SelRetrieval" +
            "%40FTIndex%40SearchFrom%40ExpertQry%40op1%40op2%40Query%40ErrorMes%40AdviceMes&Value=No" +
            "%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null" +
            "%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null" +
            "%40null%40null%40null%40null%40null%40null%40null%40null%40null%40null%40EN%4025%40OJ%2CND%2CTI" +
            "%40TEDINDEX_ARCHIVE2%40extended%40null%40AND%40AND%40cs_type%3Adb+%40null" +
            "%40null&Language=EN&Redirect=Exec%3FDataFlow%3DShowPage.dfl%26TableName%3DTED_EN" +
            "%26Template%3DTED%2FN_result_list.htm%26Page%3D1%26toconf" +
            "%3Dyes&ErrorMes=null&AdviceMes=null&RedirectError=Exec" +
            "%3FDataFlow%3DShowPage.dfl%26Template%3DTED%2Fextended_search%26BANNER%3DEXT";

        public static string GetAuthenticatedSessionId()
        {
            CookieContainer cc = new CookieContainer();

            DoHttpRequest("http://ted.europa.eu/eLogin",
                MethodType.Post,
                true,
                String.Format("USERID={0}&PASSWORD={1}", userid, password),
                cc);

            DoHttpRequest("http://ted.europa.eu/ExecSessionVariables",
                            MethodType.Post,
                            true,
                            extSearchq,
                            cc);

            DoHttpRequest("http://ted.europa.eu/Exec?" +
                "DataFlow=ShowPage.dfl&TableName=TED_EN&Template=TED/N_result_list.htm&Page=1&toconf=yes",
                MethodType.Get,
                false,
                null,
                cc);

            string sessionid = cc.GetCookies(new Uri("http://ted.europa.eu"))["JSESSIONID"].Value;
            return sessionid;
        }

        private static void DoHttpRequest(string url, MethodType mtype,
            bool isFormEncoded, string requestString, CookieContainer cc)
        {
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
            req.Method = mtype.ToString();
            req.CookieContainer = cc;

            req.AllowAutoRedirect = false;
            req.AllowWriteStreamBuffering = true;
            req.AuthenticationLevel = AuthenticationLevel.None;
            req.ImpersonationLevel = TokenImpersonationLevel.None;
            req.UserAgent =
                "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.9b4) Gecko/2008030714 Firefox/3.0b4";

            req.Proxy = null;

            if (isFormEncoded)
            {
                req.ContentType = "application/x-www-form-urlencoded";
                req.Expect = null;

                byte[] reqData = Encoding.UTF8.GetBytes(requestString);
                req.ContentLength = reqData.Length;
                Stream reqStream = req.GetRequestStream();
                reqStream.Write(reqData, 0, reqData.Length);
                reqStream.Close();
            }

            try
            {
                HttpWebResponse response = (HttpWebResponse)req.GetResponse();
                if (isFormEncoded)
                {
                    // seems that cookies received in post are not saved to collection
                    req.CookieContainer.Add(new Uri("http://ted.europa.eu"), response.Cookies);
                }

                response.Close();
            }
            catch (Exception)
            {
                // nothing useful to recover from here; let the caller decide whether to retry
                throw;
            }
        }

        private static void AddRequestString(HttpWebRequest request, string requeststring)
        {
            byte[] reqData = Encoding.UTF8.GetBytes(requeststring);
            request.ContentLength = reqData.Length;
            Stream reqStream = request.GetRequestStream();
            reqStream.Write(reqData, 0, reqData.Length);
            reqStream.Close();
        }
    }
}

Smuggling exceptions out of try-catches with IEnumerable

The code in this post is from a talk by Erik Meijer, I claim no credit.

Take a look at this screenshot:

[Screenshot: Visual Studio breaking into the debugger on an exception thrown from code wrapped in a try-catch]

Pretty wacky huh? Visual Studio has broken into debug mode even though the division is clearly wrapped in a try-catch. Take a guess at what’s going on before reading on.

Here’s the full code listing:

static void Main(string[] args)
{
    var xs = new[] { 1, 2, 3, 0 };
    IEnumerable<int> q = null;

    try
    {
        q = from x in xs select 1 / x;
    }
    catch { }

    foreach (var z in q)
    {
        Console.WriteLine(z);
    }
}

The obvious expectation is that the division-by-zero will be caught by the try-catch. Alas, the division does not happen in the scope of the try statement. Instead, due to the deferred execution paradigm of IEnumerable, it happens on a by-need basis in the foreach loop. From there the exception can propagate unopposed to the top of the stack. Haskell, where lazy evaluation dominates, avoids exceptions (and other messy side effects) altogether, partly for reasons demonstrated by this example.

UPDATE: As Wayne points out, fixing this is as easy as extending the try to cover the enumeration. The point of this post was mainly to poke fun at the Visual Studio debugger and to point out that lazy evaluation can bite you if you’re not careful.
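For completeness, a minimal sketch of that fix, with the enumeration moved inside the try:

static void Main(string[] args)
{
    var xs = new[] { 1, 2, 3, 0 };

    try
    {
        var q = from x in xs select 1 / x;
        foreach (var z in q)        // the division now happens inside the try
        {
            Console.WriteLine(z);
        }
    }
    catch (DivideByZeroException)
    {
        Console.WriteLine("Caught it this time");
    }
}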

CRMMetal and LINQtoCRM

As you may know, Microsoft introduced many-to-many relationships in version 4.0 of Dynamics CRM. Unfortunately, querying these relationships is not supported through the general web service entities, as the intermediary entities are not exposed. FetchXML works fine however (and don’t forget this great hack to generate FetchXML). The missing entities meant that LINQtoCRM didn’t support many-to-many relationships either, even though the underlying query engine generates FetchXML.

To remedy this deficiency, a simple tool is now bundled with experimental versions of LINQtoCRM. In the tradition of the other major query providers, it’s called “CRMMetal”. It works by asking the metadata web service for all the metadata, including relationships, and then filtering out the many-to-many ones:

RetrieveAllEntitiesRequest allEntitiesRequest = new RetrieveAllEntitiesRequest();
allEntitiesRequest.RetrieveAsIfPublished = false;
allEntitiesRequest.MetadataItems = MetadataItems.IncludeRelationships;

RetrieveAllEntitiesResponse allEntitiesResponse = 
	(RetrieveAllEntitiesResponse)service.Execute(allEntitiesRequest);

var mtom = allEntitiesResponse.CrmMetadata.OfType<EntityMetadata>().
	SelectMany(e => e.ManyToManyRelationships, (e, d) =>
		new
		{
			intersectname = d.IntersectEntityName,
			schemaname = d.SchemaName.Replace("_association",""),
			ent1Name = d.Entity1LogicalName,
			ent2Name = d.Entity2LogicalName,
			ent1Att = d.Entity1IntersectAttribute,
			ent2Att = d.Entity2IntersectAttribute
		}
		).Distinct().OrderBy(s => s.schemaname);

It then uses the CodeDOM API to generate classes similar to the web service ones, although the property bodies are empty and there are no XML serialization attributes. It seems CodeDOM has not been updated with recent .NET releases: you can’t generate automatic properties, and the API doesn’t support declarative composition of code, a great shame I think. Other than that, generating the code is pretty straightforward:

CodeCompileUnit targetUnit = new CodeCompileUnit();
string nameSpace = "LinqtoCRMApplication.CRM";

CodeNamespace ns = new CodeNamespace(nameSpace);
ns.Imports.Add(new CodeNamespaceImport("System"));

// The stupid codedom API doesn't properly support declarative DOM building, fail.
CodeTypeDeclarationCollection classes = new CodeTypeDeclarationCollection(
	mtom.Select(_ =>
		new CodeTypeDeclaration()
		{
			//Name = _.name,
			Name = _.schemaname,
			IsClass = true,
			IsPartial = true,
		}
	)
	.ToArray());

foreach (CodeTypeDeclaration c in classes)
{
	int count = mtom.Where(_ => _.schemaname == c.Name).Count();
	if (count > 1)
	{
		Console.WriteLine("Ignoring {0} due to duplicality", c.Name);
		continue; // bad one, multiple with same name
	}
	
	c.Members.AddRange(new CodeTypeMember[]
		{
			new CodeMemberProperty() 
			{ 
				Name = mtom.Single(_ => _.schemaname == c.Name).ent1Att,
				Type = new CodeTypeReference(nameSpace + ".Key"),
				HasGet = true,
				HasSet = true,
				Attributes = MemberAttributes.Public,
			},
			new CodeMemberProperty() 
			{ 
				Name = mtom.Single(_ => _.schemaname == c.Name).ent2Att,
				Type = new CodeTypeReference(nameSpace + ".Key"),
				HasGet = true,
				HasSet = true,
				Attributes = MemberAttributes.Public,
				
			}
		}
	);

	// goddam codedom doesn't support automatic properties, 
	// have to add something to getters. fail.
	foreach (CodeMemberProperty p in c.Members)
	{
		// just have it return null
		p.GetStatements.Add(
			new CodeMethodReturnStatement(
				new CodePrimitiveExpression(null)));
	}
}

ns.Types.AddRange(classes);
targetUnit.Namespaces.Add(ns);

CodeDomProvider provider = CodeDomProvider.CreateProvider("CSharp");
CodeGeneratorOptions options = new CodeGeneratorOptions();
options.BracingStyle = "C";
using (StreamWriter sourceWriter = new StreamWriter("ManyToManyClasses.cs"))
{
	provider.GenerateCodeFromCompileUnit(
		targetUnit, sourceWriter, options);
}

Go get the code and give it a whirl. LINQtoCRM still has a few large warts, the ugliest probably being the very limited selectors permitted. I’ll try to lift the selector implementation from LINQtoSharePoint soon.

Showing maps and borders in Processing

A few days ago, I posted a video showing public procurement expanding geographically with the EU enlargements in the ’00s. There weren’t any borders, but you could sort of see the outline of Europe and how the dots spread east with time.

Adding actual borders to the map proved very frustrating. The Geographical Information System (GIS) space seems plagued by obtuse binary formats, stodgy desktop applications and a profusion of coordinate systems. I tried many avenues, one of the more promising being a smoothed map from MapShaper passed through TatukGIS and exported to KML. The coordinates from MapShaper turned out to be incompatible with the latitude/longitude ones I had from the Google Maps web services however.

After much searching, I found a KML file in the Google Maps group (worldcountriesKML.zip) with the borders of all the world’s countries. It’s not smoothed in any way, so the jagged coastline of a country like Norway looks too thick when zoomed out. The result is better than the original though.

I implemented a simple Processing helper class to parse and draw the KML. You instantiate it like this: helper = new XMLHelper(new XMLElement(this, "world.kml"));. It has an Init() method that takes a String[] of the countries you need and a String denoting the XML path to the line coordinates, e.g. "Polygon/outerBoundaryIs/LinearRing/coordinates". After initialization you can ask it for its maximum and minimum coordinates using min_x, max_x, min_y, max_y. The Draw() method draws the map on the screen, scaled to fit the sketch size.
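A minimal usage sketch (my assumption of a typical setup; the country names are just examples) might look like this:

XMLHelper helper;

void setup()
{
  size(800, 600);
  helper = new XMLHelper(new XMLElement(this, "world.kml"));
  helper.Init(new String[] { "Norway", "Sweden", "Denmark" },
              "Polygon/outerBoundaryIs/LinearRing/coordinates");
}

void draw()
{
  background(255);
  stroke(0);
  noFill();
  helper.Draw();
}

The helper class itself: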

class XMLHelper
{
  float coordscale = 1;
  XMLElement borders;
  Line[] lines = new Line[0];
  public float max_y =0, min_y = 70, max_x = 0, min_x = 0;
  
  public XMLHelper(XMLElement xe)
  {
    borders = xe;
  }
  
  public void Init(String[] wantedcountries, String coordpath)
  {
    XMLElement[] filecountries = borders.getChildren("Folder/Placemark");
    for(int i = 0; i < filecountries.length; i++)
    {
      XMLElement c = filecountries[i];

      if(Util.Contains(wantedcountries, c.getChild(0).getContent()))
      {
        // this one should be included on the map
        String points = c.getChild(coordpath).getContent();
      
        String[] point_a = split(points," ");
        Point[] pointarray = new Point[point_a.length];
        for(int j = 0; j < point_a.length; j++)
        {
          String[] xyz = split(point_a[j],',');
          
          float x = float(xyz[0])/coordscale;
          float y = float(xyz[1])/coordscale;
  
          pointarray[j] = new Point(x, y);
  
          max_x = max(x, max_x);
          min_x = min(x, min_x);
          max_y = max(y, max_y);
          min_y = min(y, min_y);
        }
        // this looks slow
        lines = (Line[]) append(lines,new Line(pointarray));
      }
    }
  }
  
  public void Draw()
  {
    for(int i = 0; i < lines.length; i++)
    {
      Line l = lines[i];
      beginShape();
      for(int j = 0; j < l.points.length; j++)
      {
        vertex(
          map(l.points[j].x, min_x, max_x, 0, width),
          height - map(l.points[j].y, min_y, max_y, 0, height)
        );
      }
      endShape();
    }
  }
}

class Point
{
  public float x,y;
  public Point(float _x, float _y)
  {
    x = _x;
    y = _y;
  }
}

class Line
{
  Point[] points;
  public Line(Point[] _points)
  {
    points = _points;
  }
}

static class Util
{
  static boolean Contains(String[] a, String s)
  {
    for(int i = 0; i < a.length; i++)
    {
      if(a[i].toLowerCase().equals(s.toLowerCase()))
      {
        return true;
      }
    }
    return false;
  }
}

Processing and SQL server on Windows

Tim Regan has a comprehensive description of how he got SQL Server 2008 running with Processing. I found some steps to be superfluous while others were missing (it was still an invaluable guide though); here’s how I did it:

  1. Get the SQL Server JDBC driver, be sure to get the Windows version
  2. Unpack it somewhere and copy sqljdbc.jar to libraries\sqljdbc\library in your Processing folder
  3. Copy sqljdbc_auth.dll from the enu\auth\x86 folder of the JDBC driver to C:\Windows\System32
  4. In your sketch, do Sketch->Import Library->sqljdbc
  5. You can now use the helper class from Tim’s post (which is inspired by the one in Ben Fry’s book)

Note that it is apparently not necessary to muck around with the CLASSPATH. Happy querying!
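If you’d rather skip the helper class, a minimal sketch of a raw JDBC connection from a Processing sketch might look like this; the database name and table are hypothetical, and integrated security is what the sqljdbc_auth.dll from step 3 enables.

import java.sql.*;

void setup()
{
  try
  {
    Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
    Connection con = DriverManager.getConnection(
      "jdbc:sqlserver://localhost;databaseName=mydb;integratedSecurity=true;");
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM documents");
    if (rs.next())
    {
      println(rs.getInt(1));
    }
    con.close();
  }
  catch (Exception e)
  {
    println(e.getMessage());
  }
}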
