The way most old computer languages did it was to encode the date as the number of days since January 1, 0001. You can call it a 'Rata Die' or 'proleptic Gregorian date' if you want to sound fancy.
But just doing a sanity check says that 2013 x 365.25 is about 735,248 days. So you can't encode that in even an unsigned 16-bit integer, which tops out at 65,535.
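If you want to see the actual number, Python's standard library happens to use exactly this Rata Die convention (day 1 = January 1, 0001), so a quick sketch confirms the back-of-the-envelope math. This is just an illustration on a PC; your controller would do the equivalent arithmetic in structured text.

```python
from datetime import date

# date.toordinal() returns the proleptic Gregorian ordinal:
# the count of days since January 1, 0001 (which is day 1).
rata_die = date(2013, 1, 1).toordinal()
print(rata_die)  # 734869 -- far beyond the 65,535 ceiling of a 16-bit unsigned integer
```

So the rough estimate above lands in the right ballpark, and either way the value needs more than 16 bits.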
So you will have to use a 32-bit double integer (DINT), or do a different sort of encoding that doesn't adhere to a standard. My first instinct would be to use the count of days from the start of the Unix Epoch on January 1, 1970.
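A day count from the Unix Epoch is small enough to be comfortable in a signed 32-bit value for many millennia. Here is a minimal sketch of the idea in Python (again, just for illustration; the names `EPOCH` and `days_since_epoch` are mine, not from any standard library for your controller):

```python
from datetime import date

EPOCH = date(1970, 1, 1)  # start of the Unix Epoch

def days_since_epoch(d: date) -> int:
    # Number of whole days elapsed since 1970-01-01
    return (d - EPOCH).days

print(days_since_epoch(date(2013, 1, 1)))  # 15706 -- fits easily in a DINT
```

Subtracting two of these day counts also gives you elapsed days directly, which is handy for interval math.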
The calculation of the 'day of the year', also called the 'ordinal day' (and often loosely called a 'Julian day' in industry, though that term properly means something else), has been discussed on the Forum in the past. You can search this Forum, or go to the Rockwell Automation Sample Code website and search for 'julian' to find a structured text routine that calculates the ordinal day of the year.
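For reference, the calculation those routines perform is just "day of year": January 1 is day 1 and December 31 is day 365 (or 366 in a leap year). A one-liner sketch in Python shows the expected behavior, so you can sanity-check whatever structured text routine you find:

```python
from datetime import date

def ordinal_day(year: int, month: int, day: int) -> int:
    # Day of the year: Jan 1 -> 1, Dec 31 -> 365 (or 366 in a leap year)
    return date(year, month, day).timetuple().tm_yday

print(ordinal_day(2013, 3, 1))  # 60, since 2013 is not a leap year (31 + 28 + 1)
```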