Pandas and Time-based data

import pandas as pd
import random

# One record per day, at noon, for a year.
date_range = pd.date_range(start="1-Jan-2014 12:00", end="1-Jan-2015 12:00")
people = ['bob', 'tom', 'frank', 'tim']
data = []
for d in date_range:
    r = random.randrange(len(people))    # pick a random person
    value = int(random.random() * 100)   # random count, 0-99
    print(d, people[r], value)
    data.append([d, people[r], value])

r = list(range(365))
r2 = list(range(365, 0, -1))
2014-01-01 12:00:00 bob 50
2014-01-02 12:00:00 frank 24
2014-01-03 12:00:00 tom 66
2014-01-04 12:00:00 tim 33
2014-01-05 12:00:00 tom 76
2014-01-06 12:00:00 frank 9
2014-01-07 12:00:00 bob 58
2014-01-08 12:00:00 tom 27
... (output truncated - one row per day for the rest of the year) ...
2014-12-31 12:00:00 bob 95
2015-01-01 12:00:00 bob 1

The last ten records in data look like this:
[[Timestamp('2014-12-23 12:00:00', offset='D'), 'tom', 41],
 [Timestamp('2014-12-24 12:00:00', offset='D'), 'bob', 87],
 [Timestamp('2014-12-25 12:00:00', offset='D'), 'tom', 55],
 [Timestamp('2014-12-26 12:00:00', offset='D'), 'bob', 18],
 [Timestamp('2014-12-27 12:00:00', offset='D'), 'tom', 88],
 [Timestamp('2014-12-28 12:00:00', offset='D'), 'bob', 8],
 [Timestamp('2014-12-29 12:00:00', offset='D'), 'frank', 20],
 [Timestamp('2014-12-30 12:00:00', offset='D'), 'bob', 51],
 [Timestamp('2014-12-31 12:00:00', offset='D'), 'bob', 95],
 [Timestamp('2015-01-01 12:00:00', offset='D'), 'bob', 1]]
df = pd.DataFrame(data, columns=['period', 'word', 'freq'])
df.tail()
                  period   word  freq
361  2014-12-28 12:00:00    bob     8
362  2014-12-29 12:00:00  frank    20
363  2014-12-30 12:00:00    bob    51
364  2014-12-31 12:00:00    bob    95
365  2015-01-01 12:00:00    bob     1

#df = pd.DataFrame({'r': r, 'r2': r2}, index=pd.date_range('20130101', periods=365))

top_user = df.groupby('word').freq.sum()
top_user = top_user.sort_values(ascending=False)
top_user[:2]
word
tom     6525
bob     5751
Name: freq, dtype: int64
df.groupby(['period','word']).sum().head(10)
freq
period word
2014-01-01 12:00:00 bob 50
2014-01-02 12:00:00 frank 24
2014-01-03 12:00:00 tom 66
2014-01-04 12:00:00 tim 33
2014-01-05 12:00:00 tom 76
2014-01-06 12:00:00 frank 9
2014-01-07 12:00:00 bob 58
2014-01-08 12:00:00 tom 27
2014-01-09 12:00:00 bob 32
2014-01-10 12:00:00 bob 86
df['period'] = df['period'].astype('datetime64[ns]')
df.head()
period word freq
0 2014-01-01 12:00:00 bob 50
1 2014-01-02 12:00:00 frank 24
2 2014-01-03 12:00:00 tom 66
3 2014-01-04 12:00:00 tim 33
4 2014-01-05 12:00:00 tom 76
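
Now that period is a proper datetime64 column, time-based aggregation becomes straightforward. Here is a minimal sketch of where you could go next (my addition, not part of the original notebook; it assumes the df built above and a reasonably recent pandas):

# Put the datetime column on the index so we can resample.
ts = df.set_index('period')

# Total freq per month, across everyone.
monthly = ts.resample('M')['freq'].sum()

# Total freq per person per month.
per_person = ts.groupby('word')['freq'].resample('M').sum()

print(monthly.head())
print(per_person.head())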

Pandas Set Values and Group-By

#
# Filter Hours < 0 to be 0
# Get the sum of the Hours per Country
#

import pandas as pd
Line1 = {"Country": "USA", "Date":"01 jan", "Hours":4}
Line2 = {"Country": "USA", "Date":"01 jan", "Hours":3}
Line3 = {"Country": "USA", "Date":"01 jan", "Hours":-999}
Line4 = {"Country": "Japan", "Date":"01 jan", "Hours":3}
df=pd.DataFrame([Line1,Line2,Line3,Line4])
df
#Set a value using a boolean mask....
#
# df['Hours'] < 0 gives a True/False Series aligned on the index - a "mask"
# so....
#   df.loc[mask, 'FIELD'] = value
df.loc[df['Hours'] < 0, 'Hours'] = 0

#Show what the data frame now looks like
df
Country Date Hours
0 USA 01 jan 4
1 USA 01 jan 3
2 USA 01 jan 0
3 Japan 01 jan 3

4 rows × 3 columns

#
# Now group by
hr=df.groupby(['Country','Date']).Hours.sum()
hr.head()
Country  Date
Japan    01 jan    3
USA      01 jan    7
Name: Hours, dtype: int64
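
If you would rather see that as a table than a multi-indexed Series, unstack will pivot it. A small sketch of my own, reusing the hr Series above:

# Pivot the Date level of the MultiIndex into columns.
hr.unstack('Date')
Date     01 jan
Country
Japan         3
USA           7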

I hope this gave you some idea of how you can progress with Pandas.

Pandas - with Python

I have been using, or rather trying to use, R for my statistical work - but because I spend 90% of my time in other languages, I find R's way of thinking somewhat confusing.

I do, however, spend 60% of my time using Python.... which I think is one of the nicest languages I have used to date (and the list is quite long).

Getting Started with Pandas

I think a beginner would be best served by using IPython and the excellent notebook facility. It makes things much easier.

I would also add that I am doing this with IPython 3 - which has a nicer notebook interface than IPython (2.7).

My final recommendation (and we have not got to Pandas yet !!) is to install virtualenv... oh, and git.....

This walk-through, however, does not assume that you have any of these installed.

Install Pandas

There are several ways you can do this - using your OS package manager, using pip, or inside a virtualenv.

This is the rough guide to all those ways.

sudo apt-get install python-pandas
sudo pip install pandas
   OR
mkdir pandas_test
cd pandas_test
virtualenv test1
source test1/bin/activate
pip install pandas
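
Whichever route you take, a quick sanity check (my addition) will confirm the install worked:

python -c "import pandas; print(pandas.__version__)"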

CSV Time

Pandas helps you crunch data.... so let's create some data.

#!/usr/bin/python3
import random
import datetime

secs = 1400000000
print("when,word,count")
words = ['bill', 'tom', 'frank']
for n in range(1, 365):
    for h in range(0, 24):
        # One timestamp per hour: a day offset plus an hour offset.
        reftime = datetime.datetime.fromtimestamp(secs + (86400 * n) + (3600 * h))
        for w in words:
            line = "{0},{1},{2}".format(
                reftime.strftime("%Y-%m-%d %H:%M"), w, int(random.random() * 20))
            print(line)

To run this - simply type

python3 makedata.py > timeline.csv

Get the CSV data into pandas

From now on I am assuming you are using IPython/IDLE... etc.

import pandas as pd
recs = pd.read_csv('timeline.csv',
                   header=0,
                   names=['when', 'word', 'cnt'],
                   parse_dates={'datetime': ['when']},
                   keep_date_col=True,
                   index_col='datetime')

And we should have read in the data from the CSV file.

Check what columns we have

recs.columns.values

and something like this should be displayed

array(['when', 'word', 'cnt'], dtype=object)

We can then look at the dataframe (recs) by just typing recs

recs
                                when   word  cnt
datetime
2014-05-13 20:53    2014-05-13 20:53   bill    0
2014-05-13 20:53    2014-05-13 20:53    tom   18
2014-05-13 20:53    2014-05-13 20:53  frank   19
...

We have our data ... now what ?

This is often the point that the Programmer/Analyst starts to give up - and the Data Scientist starts to get excited.

  • We have data
  • It appears to all have been imported
  • What can we learn from this ?

Filter all the records by word

Instead of seeing all the records, I want to see just the 'bill' records.

Logically you can see this by doing

recs['word']=='bill'

And this will show something like

2014-05-13 20:53      True
2014-05-13 20:53     False
2014-05-13 20:53     False
2014-05-13 21:53      True
2014-05-13 21:53     False
2014-05-13 21:53     False
2014-05-13 22:53      True

But you are probably not interested in the data from a logical True/False point of view - you only want to see Bill's records.

recs[recs['word'].isin(['bill'])]
                        when    word    cnt
datetime
2014-05-13 20:53        2014-05-13 20:53        bill    0
2014-05-13 21:53        2014-05-13 21:53        bill    6
2014-05-13 22:53        2014-05-13 22:53        bill    19
2014-05-13 23:53        2014-05-13 23:53        bill    16
2014-05-14 00:53        2014-05-14 00:53        bill    17
2014-05-14 01:53        2014-05-14 01:53        bill    13

This can be refined again - to display just the cnt column:

recs[recs['word'].isin(['bill'])]['cnt']
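
From here the obvious next step is to aggregate. A rough sketch (my addition, assuming the recs frame read in above):

# Summary statistics for bill's counts.
bill = recs[recs['word'] == 'bill']['cnt']
print(bill.sum(), bill.mean(), bill.max())

# Or totals for every word in one go.
print(recs.groupby('word')['cnt'].sum())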

Pandas - a beginning

import numpy as np
import pandas as pd

df = pd.DataFrame({'key1': ['a', 'a', 'b', 'b', 'a'],
                   'key2': ['one', 'two', 'one', 'two', 'one'],
                   'data1': np.random.randn(5),
                   'data2': np.random.randn(5)})
df
data1 data2 key1 key2
0 0.012728 1.071175 a one
1 0.152258 0.246503 a two
2 -0.551473 1.101130 b one
3 0.392722 -0.919443 b two
4 -0.504487 0.385234 a one
grouped = df['data1'].groupby(df['key1'])
grouped.mean()
key1
a      -0.113167
b      -0.079376
Name: data1, dtype: float64
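
You can also group by more than one key at a time. A short sketch (my addition, using the df above):

# Mean of data1 for each (key1, key2) pair.
print(df['data1'].groupby([df['key1'], df['key2']]).mean())

# Equivalent, and usually tidier:
print(df.groupby(['key1', 'key2'])['data1'].mean())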

Better way to build a Dictionary

As I have been building dictionaries for the past few days (yes, very exciting, I agree), I have learnt a lot about the various ways to do this.

In the past I would have built a dictionary (say, for unique occurrences of an item - OK, I know I can do this using set etc.... but please humour me, this is a dictionary example) like this:

unique = {}
for a in list_of_words:
    if a in unique:
        pass
    else:
        unique[a] = 1

Which in some ways is quite nice - however:

unique = {}
for a in list_of_words:
    unique[a] = unique.get(a, 0) + 1

I think this is cleaner - and more elegant. Plus you get to record the number of occurrences of each item.
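
For completeness, the standard library has this pattern built in - collections.Counter and collections.defaultdict both get rid of the if/else entirely. A small sketch (my addition; list_of_words here is just example input):

from collections import Counter, defaultdict

list_of_words = "a b a c b a".split()

# Counter does the whole counting loop for you.
unique = Counter(list_of_words)

# defaultdict(int) gives you the same one-liner inside a loop.
counts = defaultdict(int)
for a in list_of_words:
    counts[a] += 1

print(unique, dict(counts))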

Sorting Dictionaries by value

I had to do some dictionary work today whilst in the coal-mine.... All was going well until I tried to sort by value.

#!/usr/bin/python3
from collections import OrderedDict
from collections import Counter

words = {}
line = "this is a line and a very fine line it is line up over there"
for w in line.split(' '):
    if w in words:
        words[w] += 1
    else:
        words[w] = 1
#print(words)
#
# Now sort this by value
#
d_sorted_by_value = OrderedDict(sorted(words.items(), key=lambda x: x[1]))
mc = Counter(d_sorted_by_value).most_common()[0:2]
#print(d_sorted_by_value)
print(mc)
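
As an aside, the OrderedDict step is not strictly needed just to get the top two - plain sorted() with a key function, or Counter on its own, will do. A sketch (my addition, reusing words and line from the script above):

# Sort the (word, count) pairs by count, highest first, and take two.
top_two = sorted(words.items(), key=lambda kv: kv[1], reverse=True)[:2]
print(top_two)

# Or let Counter do the counting and the ranking in one go.
from collections import Counter
print(Counter(line.split(' ')).most_common(2))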

Juliet's birthday

Start

Well, it has been 40-something years in the planning - but today a certain someone hit the mid-point between someplace and somewhere.....

We went to church in the morning !!! Yes, it is still standing, and then we chilled out at home. For lunch we met our good friends Duncan and Sheila for a smashing fish'n'chip meal, after which we retired to 'chez nous' for some light refreshments.

/galleries/juliet45/j&t.jpg
/galleries/juliet45/j&t2.jpg
/galleries/juliet45/J.jpg

Lunch

Whilst this is not a "posh" place - the food is nice, and the staff and company were excellent.

/galleries/juliet45/flamingprawns.jpg
/galleries/juliet45/flamingprawns2.jpg
/galleries/juliet45/flamingprawns3.jpg

Flaming food demo !!!

Off Home

A lovely day - with the rain and wind as the night drew in, even better.

/galleries/juliet45/j&t_car.jpg
  • There are some more pics - plus these - available at Birthday

House plans

For a while we have been thinking about building a house on some land we own back in the Philippines. Instead of just talking about it, we went to see a recommended architect and handed over our plans - to see what they could come up with.

The results were quite impressive.

Overview

House plan possibly

Ground Floor

The ground floor may look like this

House plan possibly

First Floor

The second floor (the first floor in English !!) may look like this

House plan possibly

Second Floor

The third floor (the second floor in English !!) may look like this

House plan possibly