Connecting to SAS Data Sets
Below are example connection strings for SAS files or streams, using the server's default data modeling configuration (see below).
| Service provider | URI formats | Connection example |
| --- | --- | --- |
| Local | Single file path (one table): file://localPath<br>Directory path (one table per file): file://localPath | URI=C:/folder1; |
| HTTP or HTTPS | http://remoteStream<br>https://remoteStream | URI=http://www.host1.com/streamname1; |
| Amazon S3 | Single file path (one table): s3://remotePath<br>Directory path (one table per file): s3://remotePath | URI=s3://bucket1/folder1; AWSSecretKey=secret1; AWSRegion=OHIO; |
| Azure Blob Storage | azureblob://mycontainer/myblob | URI=azureblob://mycontainer/myblob; AzureStorageAccount=myAccount; AzureAccessKey=myKey;<br>URI=azureblob://mycontainer/myblob; AzureStorageAccount=myAccount; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; |
| Google Drive | Single file path (one table): gdrive://remotePath or gdrive://SharedWithMe/remotePath<br>Directory path (one table per file): gdrive://remotePath or gdrive://SharedWithMe/remotePath | URI=gdrive://folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth;<br>URI=gdrive://SharedWithMe/folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; |
| OneDrive | Single file path (one table): onedrive://remotePath or onedrive://SharedWithMe/remotePath<br>Directory path (one table per file): onedrive://remotePath or onedrive://SharedWithMe/remotePath | URI=onedrive://folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth;<br>URI=onedrive://SharedWithMe/folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; |
| Box | Single file path (one table): box://remotePath<br>Directory path (one table per file): box://remotePath | URI=box://folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; |
| Dropbox | Single file path (one table): dropbox://remotePath<br>Directory path (one table per file): dropbox://remotePath | URI=dropbox://folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; OAuthClientId=oauthclientid1; OAuthClientSecret=oauthclientsecret1; CallbackUrl=http://localhost:12345; |
| SharePoint SOAP | Single file path (one table): sp://remotePath<br>Directory path (one table per file): sp://remotePath | URI=sp://Documents/folder1; User=user1; Password=password1; StorageBaseURL=https://subdomain.sharepoint.com; |
| SharePoint REST | Single file path (one table): sprest://remotePath<br>Directory path (one table per file): sprest://remotePath | URI=sprest://Documents/folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; StorageBaseURL=https://subdomain.sharepoint.com; |
| FTP or FTPS | Single file path (one table): ftp://server:port/remotePath or ftps://server:port/remotePath<br>Directory path (one table per file): ftp://server:port/remotePath or ftps://server:port/remotePath | URI=ftps://localhost:990/folder1; User=user1; Password=password1; |
| SFTP | Single file path (one table): sftp://server:port/remotePath<br>Directory path (one table per file): sftp://server:port/remotePath | URI=sftp://127.0.0.1:22/folder1; User=user1; Password=password1;<br>URI=sftp://127.0.0.1:22/folder1; SSHAuthmode=PublicKey; SSHClientCert=myPrivateKey; |
| Azure Data Lake Store Gen1 | adl://remotePath<br>adl://Account.azuredatalakestore.net@remotePath | URI=adl://folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; AzureStorageAccount=myAccount; AzureTenant=tenant;<br>URI=adl://myAccount.azuredatalakestore.net@folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; AzureTenant=tenant; |
| Azure Data Lake Store Gen2 | abfs://myfilesystem/remotePath<br>abfs://myfilesystem@myAccount.dfs.core.windows.net/remotePath | URI=abfs://myfilesystem/folder1; AzureStorageAccount=myAccount; AzureAccessKey=myKey;<br>URI=abfs://myfilesystem@myAccount.dfs.core.windows.net/folder1; AzureAccessKey=myKey; |
| Azure Data Lake Store Gen2 with SSL | abfss://myfilesystem/remotePath<br>abfss://myfilesystem@myAccount.dfs.core.windows.net/remotePath | URI=abfss://myfilesystem/folder1; AzureStorageAccount=myAccount; AzureAccessKey=myKey;<br>URI=abfss://myfilesystem@myAccount.dfs.core.windows.net/folder1; AzureAccessKey=myKey; |
| Wasabi | Single file path (one table): wasabi://bucket1/remotePath<br>Directory path (one table per file): wasabi://bucket1/remotePath | URI=wasabi://bucket/folder1; AccessKey=token1; SecretKey=secret1; Region='us-west-1'; |
| Google Cloud Storage | Single file path (one table): gs://bucket/remotePath<br>Directory path (one table per file): gs://bucket/remotePath | URI=gs://bucket/folder1; InitiateOAuth=GETANDREFRESH; AuthScheme=OAuth; ProjectId=test; |
| Oracle Cloud Storage | Single file path (one table): os://bucket/remotePath<br>Directory path (one table per file): os://bucket/remotePath | URI=os://bucket/folder1; AccessKey='myKey'; SecretKey='mySecretKey'; OracleNameSpace='myNameSpace'; Region='us-west-1'; |
| Azure File | Single file path (one table): azurefile://fileShare/remotePath<br>Directory path (one table per file): azurefile://fileShare/remotePath | URI=azurefile://bucket/folder1; AzureStorageAccount='myAccount'; AzureAccessKey='mySecretKey';<br>URI=azurefile://bucket/folder1; AzureStorageAccount='myAccount'; AzureSharedAccessSignature='mySharedAccessSignature'; |
| IBM Object Storage Source | Single file path (one table): ibmobjectstorage://bucket1/remotePath<br>Directory path (one table per file): ibmobjectstorage://bucket1/remotePath | URI=ibmobjectstorage://bucket/folder1; AuthScheme='IAMSecretKey'; AccessKey=token1; SecretKey=secret1; Region='eu-gb';<br>URI=ibmobjectstorage://bucket/folder1; ApiKey=key1; Region='eu-gb'; AuthScheme=OAuth; InitiateOAuth=GETANDREFRESH; |
| Hadoop Distributed File System | Single file path (one table): webhdfs://host:port/remotePath<br>Directory path (one table per file): webhdfs://host:port/remotePath | URI=webhdfs://host:port/folder1; |
| Secure Hadoop Distributed File System | Single file path (one table): webhdfss://host:port/remotePath<br>Directory path (one table per file): webhdfss://host:port/remotePath | URI=webhdfss://host:port/folder1; |
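Whatever the service provider, the connection string itself is just a sequence of semicolon-terminated key=value pairs. As a sketch (the helper name and the property dict are illustrative; the property names come from the examples above), such a string can be assembled programmatically:

```python
def build_connection_string(props):
    """Join key=value pairs into a semicolon-delimited connection string,
    matching the format of the examples in the table above."""
    return " ".join(f"{key}={value};" for key, value in props.items())

# Hypothetical Amazon S3 connection, mirroring the S3 row of the table
s3_props = {
    "URI": "s3://bucket1/folder1",
    "AWSSecretKey": "secret1",
    "AWSRegion": "OHIO",
}
print(build_connection_string(s3_props))
# URI=s3://bucket1/folder1; AWSSecretKey=secret1; AWSRegion=OHIO;
```

This is only a convenience for generating the strings shown above; any of them can equally be written by hand.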
Accessing Sub-Folders
Set the following properties to model subfolders as views:
- IncludeSubdirectories: Set this to read files and Schema.ini from nested folders. In the case of a name collision, table names are prefixed with underscore-separated folder names. By default, this is false.
- DirectoryRetrievalDepth: Set this to specify how many levels of subfolders are recursively scanned when IncludeSubdirectories is set. By default, the server scans all subfolders.
When IncludeSubdirectories is set, the automatically detected table names follow the convention below:
| File Path | Table Name |
| --- | --- |
| Root\subfolder1\tableA | subfolder1_tableA |
| Root\subfolder1\subfolder2\tableA | subfolder1_subfolder2_tableA |
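The naming convention above can be sketched as a small function (illustrative only; the server performs this mapping itself, and the function name and file extension are assumptions):

```python
import os

def table_name_for(relative_path):
    """Derive the detected table name for a file nested under the root
    folder: underscore-separated folder names prefix the file's base name."""
    parts = relative_path.replace("\\", "/").split("/")
    stem = os.path.splitext(parts[-1])[0]  # drop the extension, e.g. .sas7bdat
    return "_".join(parts[:-1] + [stem])

print(table_name_for(r"subfolder1\tableA.sas7bdat"))
# subfolder1_tableA
print(table_name_for(r"subfolder1\subfolder2\tableA.sas7bdat"))
# subfolder1_subfolder2_tableA
```

Note that the path is taken relative to the configured root folder, so the root itself never appears in the table name.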