Function Get-CosmData {
<#       
.SYNOPSIS
	Retrieve data from Cosm for a specified feed and for a specified period.
.DESCRIPTION
	The primary purpose is to allow "logical" requests to be made without consideration of the limits placed by Cosm on
	the extent of data that may be retrieved in one request.
	An input data specification is used to format an API query to Cosm, and a Web Client returns the
	requested data. The data is then converted to a standard format and sent down the PowerShell pipeline.
	If the optional parameter "specification" is received from the pipeline, multiple requests may be
	issued and combined further down the pipeline.
	With multiple requests, CSV output format is the same as "listing a feed", i.e. the feedid is included
	in the output. For XML, separate EEML documents are sent down the pipeline and would need to be
	combined separately.
.PARAMETER specification (optional)
	A [hashtable] of name-value pairs which defines the data to be extracted; it may be received from the pipeline.
	The names are a subset of the command-line parameters (abbreviated if required) and the values should follow the same constraints.
	Parameters provided on the command line apply to all requests in a single execution of Get-CosmData unless
	overridden by a parameter received in specification. Parameters provided in specification do not carry over to subsequent extracts. To negate a
	command-line parameter without replacement, set it to $null.
.PARAMETER apikey
	A [string] containing a Cosm API key which allows read access to the required feed and datastreams.
	If no API key is entered on the command line, the user's profile will be examined for a Cosm_APIKey from a previous execution.
	The API key from the command line will be securely stored in the user's profile. TBA
	An API key received from specification will temporarily override the command-line API key but will not be stored in the profile.
.PARAMETER feed
	A [string] containing the Cosm FeedId (aka Environment) that contains the required data
.PARAMETER datastreams
	An array of strings, or a [string] containing a comma-separated list of the datastream names to be extracted.
	If all datastreams are required, omit this parameter.
.PARAMETER format
	A [string] containing the type of data to be returned. Valid values are 'csv' and 'xml' ('json' tba).
	If not specified, defaults to csv (not json as per Cosm).
	For format type 'xml', output is an XML file in EEML format containing the datapoints for the time period specified as well
	as the current value.
	For output type 'csv', datapoints are returned through the PowerShell pipeline one per line. Each datapoint is a CSV string
	containing feed,datastream,timestamp,value
	OUTPUTVALUE feed
		A [string] containing the FeedId
	OUTPUTVALUE datastream
		A [string] containing the Datastream that contained the value
	OUTPUTVALUE timestamp
		A [datetime] being the timestamp applying to the value.
	OUTPUTVALUE value
		A [string] containing the value of the datapoint.
.PARAMETER start 
	a [DateTime] object, or a [string] representing a date and time, which is the earliest data to be retrieved from Cosm.
	If the date is not in a form that can be interpreted as an ISO 8601 date with timezone, it will be interpreted as local time on the PC.
.PARAMETER end
	a [DateTime] object, or a [string] representing a date and time, which is the latest data to be retrieved from Cosm.
	If the date is not in a form that can be interpreted as an ISO 8601 date with timezone, it will be interpreted as local time on the PC.
.PARAMETER duration
	A duration parameter is either a [timespan], or a string conforming to the Cosm duration requirement (like "6hours").
	Valid units are seconds, minutes, hours, days, months, years, or the singular form.
.PARAMETER interval
	An [integer] representing the sampling interval in seconds for the Cosm data to be extracted.
	The last raw datapoint in a sample timeslot is returned. The timeslots are currently aligned to UTC, i.e.
	an interval of 86400 returns the last raw datapoint in a (UTC) day, and 3600 returns the last raw datapoint in
	an hour. start and end have no effect on the sample timeslots.
	If there is no raw datapoint in the timeslot then nothing is returned.
	Allowable values are 0, 300, 600, etc. '0' is used for raw data, i.e. all of the original input data.
	[note that for interval=0 Cosm will severely limit the duration (to about 6 hours?)]
.PARAMETER per_page
	An [integer] representing the maximum number of datapoints that may be retrieved in a single execution.
	If the number of datapoints required to satisfy the specification is greater than per_page,
	the remainder may be obtained by re-issuing the request with an additional parameter 'page' set to 2 --> n.
	See also parameter max_page.
.PARAMETER page
	An [integer] representing the number of the page to be returned.
	When an extract returns more datapoints than the per_page limit, the data may be retrieved in multiple pages,
	but only the first per_page datapoints are returned unless the page parameter or the max_page parameter is entered.
	See also the definitions of per_page and max_page.
.PARAMETER max_page
	An [integer] representing the maximum number of pages to be returned.
	Defaults to 1 and is ignored if page is not null.
	If max_page is greater than 1 the request will be reissued with page=1 --> page=max_page.
	Caution: Do NOT set max_page arbitrarily high, as mistakes could cause severe overloading of the Cosm servers.
.PARAMETER find_previous
	A [string] containing the word "true" will cause the datapoint immediately before the start to be retrieved 
	(in addition to the datapoints specified). Has no effect when "interval_type=discrete" is specified
.PARAMETER interval_type
	A [string] containing the word "discrete" will cause the samples to be timed at the boundary of the 
	sample interval. So for a sample interval of 3600, datapoints will be produced at 00:00:00, 01:00:00 etc 
	and will have the values current at that time. This may mean that the same raw datapoint is repeated a number of times
.PARAMETER time
	NB. Time is now deprecated in Cosm.
	a [DateTime] object, or a [string] representing a date and time, for which the current value is required. 
	For output in xml or json it is returned as current_value, not as a datapoint; for csv it will be returned as a datapoint.
	If the string is not in a form that can be interpreted as an ISO 8601 date, it will be interpreted as local time on the PC.
.PARAMETER timezone
	A [string] containing the timezone in which the output should be produced. This may be a city name
	or a time offset in hours to 1 decimal place (-ve for West longitude, +ve for East longitude).
	http://Cosm.com/docs/v2/#displaying-times-in-different-zones-via-the-api contains a list of place names that can be used.
	Note that currently place names are not validated but are case-sensitive.
.EXAMPLE
	$String = Get-CosmData -sp @{feed="40873"; datastreams="1,2,6"; apikey="1234567890"; interval="300"; timezone="Melbourne"} -st $FromTimestamp 
	# returns all data in a string variable
.EXAMPLE
	$Spec=@{}
	$Spec.feed="40873"
	$Spec["datastreams"]="1,2,6"
	$spec.apikey="1234567890"
	$spec.interval=300
	$spec.timezone="+11"
	Get-CosmData -sp $spec -st $FromTimestamp | Next_Function # passes data in Powershell pipeline
.EXAMPLE
	@{feed=40873;datastreams="1,2"},@{feed=5378;datastreams=1} | Get-CosmData -dur "1day" -interval 300
	# returns data from 2 different feeds for the same time period (last 24 hours of data)
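.EXAMPLE
	# Illustrative sketch only: the feed, APIKey and duration values below are placeholders.
	# A request spanning more than the maximum permitted duration for interval 300 (5 days, see NOTES 2)
	# is satisfied automatically by setting max_page, which re-issues extracts until the period is covered.
	Get-CosmData -feed "40873" -apikey "1234567890" -interval 300 -duration "12days" -max_page 4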
.NOTES 1
	If a duration is supplied, either start or end must be omitted. If both are omitted,
	end defaults to the current time. The undefined one will be calculated from "duration=end-start".
	If specified, start and end are included in the period extracted.
.NOTES 2
	The maximum Cosm history duration for a given update interval is given by the following table.
	This function will reject a data request that exceeds the allowed amount, since Cosm won't accept it.
		___Interval___Duration___
		| 0			| 6hours	|
		| 30		| 12hours	|
		| 60		| 24hours	|
		| 300		| 5days		|
		| 900		| 14days	|
		| 3600		| 31days	|
		| 10800		| 90days	|
		| 21600		| 180days	|
		| 43200		| 1year 	|
		| 86400		| 1year	 	|
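	For example (illustrative feed and key values), an interval of 300 may span at most 5 days in one extract:
		Get-CosmData -feed "40873" -apikey "1234567890" -interval 300 -duration "5days"   # accepted
		Get-CosmData -feed "40873" -apikey "1234567890" -interval 300 -duration "6days"   # rejected unless max_page > 1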
.NOTES 3
	There are 2 situations where multiple pages are needed to fulfil a request.
	The first is where the duration is less than the maximum permitted duration but the interval will
	return more datapoints than the per-page limit. Even for the maximum per_page (1000 datapoints)
	this happens for intervals of 900 or less. Especially for interval=0 (raw datapoints),
	thousands of datapoints are possible in the 6-hour duration and only per_page datapoints
	will be returned for each extract. One way of getting the missing datapoints is to re-issue
	the request multiple times with page=1,2,..n. This is not used by the automated process employed,
	but may be individually specified by the user.
	The second situation is where the requested duration is greater than the maximum permitted duration.
	To return the requested time range multiple requests are needed, and the start and end points must
	be adjusted for each subsequent request to remain in the legal range.
	The automated process, which is invoked by setting max_page > 1, handles both situations in the following manner.
	An extract is made and the returned data examined to see if it has completed the request.
	If not, and provided max_page has not been exceeded, another extract is made starting at
	the last datapoint received.
.NOTES 4
	The per_page limit is absolute, e.g. if you have 3 datastreams and per_page is 100 you will retrieve
	33 datapoints for each datastream and 34 for the first. If you are re-submitting a request to pick up
	the following data, be sure to start with the timestamp of the last datapoint returned. You will have
	some duplicates but won't lose any.
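.EXAMPLE
	# Illustrative manual re-paging per NOTES 4 (placeholder feed and key values): after a full page
	# of raw data (interval 0), request the next page, or re-issue with -start set to the timestamp
	# of the last datapoint returned.
	Get-CosmData -feed "40873" -apikey "1234567890" -interval 0 -per_page 1000 -page 2 -st $FromTimestamp -end $ToTimestamp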
#>  
	param(
		 [Parameter(Mandatory=$false)][string]$apikey=$null
		,[Parameter(Mandatory=$false,Position=0)][string]$feed=$null
		,[Parameter(Mandatory=$false,Position=1)]$datastreams=$null
		,[Parameter(Mandatory=$false,Position=2)]$start=$null
		,[Parameter(Mandatory=$false,Position=3)]$end=$null
		,[Parameter(Mandatory=$false,Position=4)]$duration=$null
		,[Parameter(Mandatory=$false,Position=5)][int]$interval=0
		,[Parameter(Mandatory=$false,Position=6)][ValidateSet("csv","xml","json")]$format="csv"
		,[Parameter(Mandatory=$false)][ValidateSet("true",$null)]$find_previous=$null
		,[Parameter(Mandatory=$false)][ValidateSet("discrete",$null)]$interval_type=$null
		,[Parameter(Mandatory=$false)][ValidateRange(1,1000)][int]$per_page=1000
		,[Parameter(Mandatory=$false)]$page=$null
		,[Parameter(Mandatory=$false)]$max_page=1
		,[Parameter(Mandatory=$false)]$time=$null
		,[Parameter(Mandatory=$false)]$timezone=$null
		,[Parameter(ValuefromPipeline=$True,Mandatory=$false)]$specification=@{}
	)
	Begin {
		Write-Verbose "$($MyInvocation.MyCommand.Name):: Function started" 
		
		#Define a list of parameters that can be included in specification and the minimum length of abbreviation
		# format, find_previous, interval_type, per_page, timezone are omitted from specification as they affect output in a way which makes it difficult to combine later
		$specparms=@{"apikey"=1;"feed"=2;"datastreams"=2;"start"=1;"end"=1;"duration"=2;"interval"=1;"page"=1;"max_page"=1;"time"=1}
		
		# setup a regular expression for matching parameters that are allowed in Specification when they are abbreviated using commandline abbreviation rules
		$RegExp=MakeRegExp $specparms 
		#PN: When the content of the regex is stable $RegExp can be hard coded  from the results of running MakeRegExp once E.g.
#		$RegExp="\A(?<start>s(?:t(?:a(?:r(?:t)?)?)?)?)|(?<page>p(?:a(?:g(?:e)?)?)?)|(?<feed>fe(?:e(?:d)?)?)" `
#		+ "|(?<apikey>a(?:p(?:i(?:k(?:e(?:y)?)?)?)?)?)|(?<duration>du(?:r(?:a(?:t(?:i(?:o(?:n)?)?)?)?)?)?)" `
#		+ "|(?<end>e(?:n(?:d)?)?)|(?<max_page>m(?:a(?:x(?:_(?:p(?:a(?:g(?:e)?)?)?)?)?)?)?)" `
#		+ "|(?<time>t(?:i(?:m(?:e)?)?)?)|(?<interval>i(?:n(?:t(?:e(?:r(?:v(?:a(?:l)?)?)?)?)?)?)?)" `
#		+ "|(?<datastreams>da(?:t(?:a(?:s(?:t(?:r(?:e(?:a(?:m(?:s)?)?)?)?)?)?)?)?)?)\Z"
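		# Illustration (assuming the named-group form shown above): an abbreviated specification key
		# such as "dur" matches the <duration> group, recovering the full parameter name, e.g.
		#   "dur" -match $RegExp                      # $true
		#   $matches.keys | where-object {$_ -ne 0}  # the full name "duration"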
		
		#save the values of command-line parameters that could be overridden from specification
		$CL=@{}
		foreach ($key in $specparms.keys ) {
			$x=Get-Variable -name $key -valueonly
			If ($x -ne $null) {
				$CL.$key = $x
			}
		}
#		 If ($apikey -ne $null){
#			 if (Test-path $profile) {
#				 $profile:$CosmAPIKey=$apikey
#			 }
#			 else {
#				
#			 }
#		 }

		#------------------------
		# Create Web Client 
		#------------------------	
		$pbQueryHT = new-object System.Collections.Specialized.NameValueCollection
		$pbWC = WebClient -baseaddress "http://api.Cosm.com/v2/feeds/"
		$pbWC.Headers.Add("X-ApiKey",$APIKey)
	}
	
	Process {	
		Write-Verbose "$($MyInvocation.MyCommand.Name):: Process started"
 
		Try { # Retrieve parameters	
			#------------------------------------
			#restore parameters from CommandLine (in case this is not the first iteration from the pipeline)
			#------------------------------------
			foreach ($key in $CL.keys)	{#get parm entered on commandline
				if ($key -match $regexp) {#see if it is allowed in specification
					Set-Variable -name $matches.$key -Value ($CL[$key])
				}
				else {write-host "Unrecognised command-line parameter $key=$($CL[$key])"}
			}
			#------------------------------------
			#unpack parameters from specification
			#------------------------------------
			foreach ($key in $specification.keys)	{
				if ("$key" -match $regexp) {
					[array]$MatchNames=$matches.keys #PN: doing this because we can't index into $matches.keys directly
					$MatchName=$MatchNames[0] #the non-abbreviated name
					Set-Variable -name $MatchName -Value ($specification[$key])
				}
				else {write-host "Unrecognised specification parameter $key=$($specification[$key])"}
			}
		} # Retrieve parameters
		Catch { # Report Parameter Restoration Error
			write-Host "problem retrieving parameters  $_ "
		} # Report Parameter Restoration Error
		
		Try { # Get Cosm Data for current pipeline item
			
			Try { #validate Parameters and obtain Cosm defaults
				#--------------------------------
				# Non-validated parameters - feed, format, find_previous, per_page, interval_type, apikey
				#--------------------------------
				#PN: prefix pb is used for parameters formatted to be sent to Cosm
				$pbFeedId=$feed
				$pbFormat=$format
				$pbFindPrevious=$find_previous
				$pbPerPage=$per_page
				$pbIntervalType=$interval_type
				$pbAPIKey=$apikey
				
				#--------------------------------
				# Validate parameters
				#--------------------------------
				# interval
				[int]$pbExtractIntervalINT = Validate-interval $interval

				# interval and duration combination
				$pbDurationTS,[timespan]$DurationMaxTS,$DurationMaxSTR=$(validate-duration $duration $pbExtractIntervalINT)[0,1,2]
				
				# start, end and duration combination
				[Datetime]$pbRequestStartUtcDT,[Datetime]$pbRequestEndUtcDT,[Timespan]$pbRequestDurationTS=$(Validate-TimePeriod $start $end $pbDurationTS)[0,1,2]
				$pbExtractDurationTS=$pbRequestDurationTS
				$pbExtractDurationSTR=""
				$pbExtractStartUtcDT=$pbRequestStartUtcDT 
				$pbExtractEndUtcDT=$pbRequestEndUtcDT 						

				# timezone
				$pbTimezoneSTR=validate-timezone $timezone
				write-Debug "pbTimezoneSTR=$pbTimezoneSTR"
				
				# page and max_page
				if ($page -ne $null ) {# page parameter entered
					if ($page -is [int]) {#valid page parameter received
						[int]$pbPage=$page
						if ($max_page -ne $null) { # MaxPage ignored
							write-Host "parameter max_page ignored since page is specified"
							[int]$pbMaxPage=$page
						} # Maxpage ignored
					}#valid page parameter received
					else {#page is not valid allow max_page to prevail
						write-Host "parameter page must be specified as integer. Set to 1"
						[int]$pbPage=1						
						if ($max_page -eq $null) {$pbMaxPage=1}
						else{[int]$pbMaxPage=$max_page}
					}#page is not valid allow max_page to prevail
				}# page parameter entered
				else {# page is null
					[int]$pbPage=0
					if ($max_page -eq $null) {$pbMaxPage=1}
					else{[int]$pbMaxPage=$max_page}
				}# page is null

				# Dont allow Excessive duration request, it will be refused by Cosm
				if ($pbRequestDurationTS -gt $DurationMaxTS) { # Requested duration exceeds max allowed
					if ($max_page -eq $pbPage) { #multiple pages not requested
						Throw "Requested duration $($pbRequestDurationTS.totalhours)hours is greater than the Maximum allowed $($DurationMaxTS.totalhours)hours`n `
						Specify a shorter Duration, Or a longer Interval than $interval or set max_pages > 1"
					} #multiple pages not requested
					else { #multiple pages are allowed
						#PN: multiple pages are handled by successively changing the ExtractStartDT
						$pbExtractDurationTS=$DurationMaxTS
						$pbExtractDurationSTR=""
						$pbExtractEndUtcDT=$pbRequestStartUtcDT + $pbExtractDurationTS
						Write-Host "Extract End Date reduced to $pbExtractEndUtcDT as duration exceeded the maximum permitted of $DurationMaxSTR `n`tUp to $max_page extracts will be performed to satisfy the request"
					} #multiple pages are allowed
				} # Requested duration exceeds max allowed
				else {
					if ($max_page -eq $pbPage) { # Single page request
						#warn that multiple pages may be required to fulfill
					} # single page request
					else {
						$pbMaxPage=$max_page
					}
				}
				

				#---------------------------------------------------- 
				# Formulate URI string
				#----------------------------------------------------
				if ($datastreams -eq $null) {
					$local:pbUriSTR =[string]::concat( $pbFeedId, ".", $pbFormat)       
					$pbDatastreamId=$null
				} #omit datastreams from query; equivalent to all datastreams
				elseif ($datastreams -is [array]) {		
					$local:pbUriSTR =[string]::concat( $pbFeedId, ".", $pbFormat)       
					$pbDatastreamId=$datastreams -join ","
				} #send as query parameter
				else {
					$pbDatastreamId=$null
					$local:pbUriSTR =[string]::concat( $pbFeedId,"/datastreams/",$datastreams, ".", $pbFormat)       
				} #include single datastream in URI

			} #validate Parameters and obtain Cosm defaults
			catch { #report Validation Error
				Throw "$($MyInvocation.MyCommand.Name):: Parameter Validation Error $_ "
			}	#report Validation Error

			for ( ; $pbPage -le $pbMaxPage; $pbPage++){ # Loop to run extract(s) to fulfill request 
				write-Host "Retrieving Cosm History data from Feed $pbFeedId for period $pbExtractStartUtcDT to $pbExtractEndUtcDT "


				Try { # Setup Web client for retrieving data
					[String]$local:pbExtractStartUtcSTR=get-Date $local:pbExtractStartUtcDT -f u #format into UTC string
					[String]$local:pbExtractEndUtcSTR=get-Date $pbExtractEndUtcDT -f u
					$pbExtractStartUtcSTR = $pbExtractStartUtcSTR.replace(" ","T")
					$pbExtractEndUtcSTR = $pbExtractEndUtcSTR.replace(" ","T")
					
					#------------------------
					# Set Query parameters
					#------------------------
					$pbQueryHT.clear() # empty the query string
					if ($pbExtractStartUtcSTR -ne "") {$pbQueryHT.Add("start",$pbExtractStartUtcSTR) } #all data points after $pbExtractStartUtcSTR
					if ($pbExtractEndUtcSTR -ne "") {$pbQueryHT.Add("end",$pbExtractEndUtcSTR)  	}#and before $pbExtractEndUtcSTR
					if ($pbExtractDurationSTR -ne "") {$pbQueryHT.Add("duration",$pbExtractDurationSTR ) }#not needed if end is specified
					$pbQueryHT.Add("interval",$pbExtractIntervalINT)  #samples one value in each interval (usually the last in the interval?)
					if ($pbDatastreamId -ne $null) {$pbQueryHT.Add("datastreams",$pbDatastreamID)   }
					If ($pbTimezoneSTR -ne "") {$pbQueryHT.Add("timezone",$pbTimezoneSTR) } #careful timezone codes are case sensitive
					if ($pbPage -gt 1) {$pbQueryHT.Add("page",$pbPage) } #PN: this is a specific page.
					if ($pbPerPage -ne $null) {$pbQueryHT.Add("per_page",$pbPerPage) } #PN: this is the max size of a page
					if ($pbFindPrevious -ne $null) {$pbQueryHT.Add("find_previous","true")}  #get the first datapoint prior to the range being extracted
					if ($pbIntervalType -ne $null) {$pbQueryHT.Add("interval_type","discrete") } #Standardises the datapoint timestamp to the endpoint of the interval. Otherwise the actual timestamp is provided
					#& $List_hashtable $pbQueryHT
					$pbWC.QueryString = $pbQueryHT 	# Attach QueryString to the WebClient.
					write-debug "pbUriSTR=$pbUriSTR"
				} # Setup Web client for retrieving data
				catch {  # Report Web client setup Error
					Throw "$($MyInvocation.MyCommand.Name):: Web Client Setup Error $_ "
				}  # Report Web client Setup Error
						
				try { # Download Cosm Data
					$pbExtractedSTR = $pbWC.DownloadString($pbUriSTR) # retrieve all datapoints into a single string (max size about 30K)
					#TODO: change to openread followed by readline?
					Try { # Output Data to pipeline
						if ($pbformat -eq "csv"){ # Send CSV Output to pipeline as separate Datapoints
							#[string[]]$pbExtractedSTAL=$pbExtractedSTR -split "\n"
							[System.Collections.ArrayList]$pbExtractedSTA=$pbExtractedSTR -split "\n"
							$CountDatapoints=$pbExtractedSTA.count
							#find the latest datapoint timestamp across all datastreams
							$LastDatapointAt=$($pbExtractedSTA[$CountDatapoints-1] -split ",")[1]
							$DatapointAt=$LastDatapointAt
							#Have we finished extracting?
							if ($CountDatapoints -ge $pbPerPage) { # A full page has been received
								#NB. There are possibly more datapoints and the set of datapoints 
								#    at the latest time may not be complete for all datastreams
								#remove set of datapoints with the latest time as it may be incomplete
								for ($i = $CountDatapoints-1; $DatapointAt -eq $LastDatapointAt; $i--) {
									$pbExtractedSTA.removerange($i,1)								
									$DatapointAt=$($pbExtractedSTA[$i-1] -split ",")[1]
								}
								#reset the dates for another extract
								$pbExtractStartUtcDT=$(Get-Date $LastDatapointAt).touniversaltime() #make sure we re-pickup the last datapoint
								$pbExtractEndUtcDT= $pbExtractStartUtcDT + $pbExtractDurationTS
								if ($pbExtractEndUtcDT -ge $pbRequestEndUtcDT) {$pbExtractEndUtcDT = $pbRequestEndUtcDT} #dont get more than requested
							} # A full page has been received
							else { # the page is not full
								# this is used later as the condition to end the loop
							} # the page is not full
							$pbExtractedSTA |  foreach-object{"$pbFeedId,$_"} # Add Feedid to Datapoint and output to pipeline
						} # Send CSV Output to pipeline as separate Datapoints
						elseif  ($pbformat -eq "xml"){ # Send XML Output to pipeline as serialized string
							#$pbExtractedSTR
							#find the latest datapoint across all datastreams
							$xmlData=[xml]$pbExtractedStr
							$xmlData
							#$lastDatapoint=$xmldata.selectnodes('eeml/environment/data/datapoints[last()].get-attribute("at")')
						} # Send XML Output to pipeline as serialized string
						else { # Send Output to pipeline as string
							$pbExtractedSTR
						} # Send Output to pipeline as string
					} # Output Data to pipeline
					# if extract does not satisfy request change extract startDT
					Catch { # Report Output Data Error
						write-Host "problem outputting data from Cosm $_ `n $StackTrace"
					} # Report Output Data Error
				
				} # Download Cosm Data
				catch { # Report Download Cosm Data Error
					#write-Host "problem downloading data from Cosm $_ `n $StackTrace"
					#TODO: get the return code from $_ 
					$GetDataError=$_.exception.innerexception
					$Response=$GetDataError.response
					[string]$ResponseMessage=$GetDataError.message 
					[int]$ResponseStatusCode=$Response.statuscode
					[string]$ResponseUri=$Response.ResponseUri
					$ResponseHeaders=$response.headers
					write-Host $ResponseHeaders
					write-Host "Response Status Code: $ResponseStatusCode 	`nError Message: $ResponseMessage `nResponse Uri: $ResponseUri"
					write-Host "Response Headers:-"
					for ($i=0; $i -lt $ResponseHeaders.count; $i++) {
						write-Host "$($responseheaders.keys[$i].padleft(20)): $($responseheaders[$i])"
					}
				} # Report Download Cosm Data Error
			
				if ($CountDatapoints -lt $pbPerPage) { # there are no more datapoints to return
					break # exit the loop
				}  # there are no more datapoints to return
				elseIf ($pbPage -eq $pbMaxPage)  { # Have run out of pages
					Write-Host "Max Pages reached but still not all data retrieved. Last datapoint returned was $DatapointAt"
				} # have run out of pages
			
			} # Loop to run extract(s) to fulfill request 
			
		} # Get Cosm Data for current pipeline item
		Catch { # Report Get Cosm Data Error with StackTrace
			write-Host "problem getting data from Cosm $_ `n $StackTrace"
		} # Report Get Cosm Data Error with StackTrace
	}
	
	End {
		$pbWC.Dispose() # release the WebClient
		Write-Verbose "$($MyInvocation.MyCommand.Name):: Function ended "
	} 
}

