### Need to practice creating a dummy matrix and checking its size
### It needs to be 499085 rows x 22298 columns
### Each cell holds a value with two digits before and two after the decimal point (e.g. 10.52)

### Read in one yearly file first and check how much memory it uses

### Directory of yearly files

tmax.dir = '/home/99/jc152199/MicroclimateStatisticalDownscale/250mASCII/brtpredsfinal/maxgzip'

### List of all yearly files

tfiles = list.files(tmax.dir, pattern = 'allpos_alldays', full.names = TRUE, recursive = TRUE)

#### Read in one file as a matrix

ymat = as.matrix(read.csv(tfiles[1]))

#### The line above reads the file in as a matrix with 'dimnames' set from the column headings;
#### the first 4 rows hold the row/col positions and the lat/long taken from the ASCII grid
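### A quick way to inspect the layout without loading a full 1.37 Gb file
### is to read only the first few rows via the 'nrows' argument. A minimal
### sketch, using a small temporary CSV as a stand-in for a yearly file
### (the column names here are illustrative, not from the real files):

```r
tmp.csv = tempfile(fileext = '.csv')
write.csv(data.frame(row = 1:6, col = 1:6, lat = runif(6), long = runif(6)),
          tmp.csv, row.names = FALSE)

### Read only the first 4 rows to peek at structure and column headings
peek = as.matrix(read.csv(tmp.csv, nrows = 4))
dim(peek)   # 4 rows, 4 columns
```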

### Size in memory in Gb (note: object.size() reports memory use, not size on disk)

Gb.size = as.numeric(object.size(ymat)) / 1024^3
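### Cross-check: a numeric (double) matrix in R stores 8 bytes per cell,
### so the in-memory size of the full matrix can be predicted from the
### target dimensions alone:

```r
### 499085 rows x 22298 cols x 8 bytes per double, converted to Gb
pred.Gb = (499085 * 22298 * 8) / 1024^3
pred.Gb   # about 82.9 Gb
```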

### This particular file (1950 yearly summary) takes 1.37 Gb in memory
### So all 60 years should take about 82.33 Gb (60 * the single-year size)

### Check this by creating a dummy matrix

dummy.uber.mat = matrix(data=25.22,nrow=499085,ncol=22298)

### The login node only has 64 Gb of memory, so this will need a compute node
### And the matrix will need to be split into about 10 row-chunks
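### A hedged sketch of one way to do the split: assign each row to one of
### 10 roughly equal chunks (this chunking scheme is an assumption for
### illustration, not taken from the original workflow):

```r
n.rows   = 499085
n.chunks = 10

### cut() assigns each row index to one of 10 equal-width bins,
### split() then collects the row indices belonging to each chunk
chunk.id   = cut(seq_len(n.rows), breaks = n.chunks, labels = FALSE)
chunk.rows = split(seq_len(n.rows), chunk.id)

### Each chunk holds ~49908 rows, i.e. roughly 8.3 Gb at 8 bytes per cell,
### which fits comfortably within the 64 Gb available on the login node
```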


