Lesson 01

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 01, Section 0 Exercise: 00 020 Validate Workshop Environment

Validate the Workshop Environment

For this SAS Viya Workshop, you will have access to your own SAS Viya environment deployed on Microsoft Azure. You will also get one dedicated Windows client machine and one Linux machine on the RACE environment (SAS Data Center). These two machines act as jump hosts that give you access to the Azure environment.

Your access into the environment will be through the Windows machine. From that Windows machine, you will use the browser and terminal emulation software to access the backend environment.

When Is My SAS Viya Deployment Ready?

After you book the workshop collection in the RACE application, many steps come into play:

  • RACE machines are started.
  • Numerous tools are installed on the RACE machine(s).
  • SAS Viya is deployed on Azure, configured and customized.

This takes time, usually between 60 and 90 minutes.

Do NOT open Chrome until the collection is ready. Opening it too early can prevent Chrome from updating correctly.

Checking the Windows Client

  • Connect to the Workshop Windows Client, give it a few seconds to initialize, then check for the following background message:

or

If you don’t see the Preparing environment or the Waiting for SAS Viya message, you have probably signed in too early to the Windows Client.

Sign out using the following method (do NOT close your Remote Desktop window!) and come back later (30 minutes after the reservation email):

If you see one of the two messages, then your Windows Client is running as expected.

You can monitor the progress of the Workshop Collection in detail (next step) or you can simply wait for it to be ready.

The background will be updated once the Workshop Collection is fully ready:

And the following dialog box will be displayed:

Again, this can take 60 to 90 minutes after the reservation email notice.

Bookmarks

Click OK and the SAS Viya Login page will open automatically in Chrome:

Access via Reserved Client Machine

At times it may be easier and/or necessary to access the hands-on exercise and associated assets directly from the reserved client machine. You can access them using the yellow Chrome shortcut created on the desktop:

Or if Chrome is already open, you can use the Hands-On Instructions bookmark:

Client Browser to Hands-on Exercises

From this client machine you can copy paste the instructions directly from the hands-on document.

Monitor the progress of the Workshop Collection

Optional if you simply prefer to wait for the Windows dialog box (previous step).

Recommended if the wait has been too long and/or you want to see what’s going on.

  • On your Workshop Windows Client, open MobaXterm (1), open a new sasnode01 session (2), copy the gellow_tail command (3) and paste it at the prompt (4):
gellow_tail

  • Press Enter

You’ll see the progress of the Workshop Collection configuration. The collection will be ready when you see this final message:

11/19/21 15:18:03: #####################################################################################
11/19/21 15:18:03: ####### DONE WITH THE BOOTSTRAPPING OF THE MACHINE ##################################
11/19/21 15:18:03: #####################################################################################

You can then further check that everything went fine (no errors) and see the URLs to be used:

  • Press Ctrl-C to exit the tail command

Log in to SAS Viya

Once it has been confirmed that SAS Viya has been deployed successfully, you can test it.

For click-by-click instructions, watch the demonstration: Demo: LogIn SAS Viya

URL

  • In sasnode01 MobaXterm session, RUN:
cat ~/urls.md

or,

  • Open Chrome (if it’s not already open) and click on the SAS Viya bookmark.

User

You will be using geldmui@gelenable.sas.com.

Pass

We need the password of the geldmui@gelenable.sas.com user that we will be using to log on to both SAS Viya and the Azure Portal.

Run the following command on the sasnode01 session:

getpw

Expected result:

Copy the password (do not press Ctrl-C; just highlight the entire string) into a notepad for later use.

We will use Azure Single Sign-On using OpenID. Click the Log-in with OpenID link:

If this is the first time you log on to Microsoft Azure, you will be prompted for the user id and password (in Microsoft Azure dialogs):

The user is: geldmui@gelenable.sas.com

The password is the string you get from the getpw command on the sasnode01 MobaXterm session.

You can stay signed in:

Opt-in for administrative rights in SAS Viya:

Validate you have access to various SAS applications.

Conclusion

You validated your SAS Viya deployment and you now know how to log in.

End


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 01, Section 0 Exercise: 00 030 PVCs Pods Check

Environment Checks

The hands-on exercises use certain SAS Viya pods, persistent volume claims, and mounts. You should check them before moving on.

Check Persistent Volume Claims PVCs

In Kubernetes, a PVC is associated with a PV, and a Pod can mount a PVC. It’s that simple!

A persistent volume (PV) is the physical volume on the host machine that stores your persistent data. A persistent volume claim (PVC) is a request for the platform to create a PV for you, and you attach PVs to your pods via a PVC.

In sasnode01 MobaXterm session, RUN:

kubectl get pvc

We want to see all the PVCs with a status of Bound.

For our workshop, for Python exercises and publishing destinations, we need the following PVCs:

  • sas-pyconfig and sas-microanalytic-score-astores are needed for Python exercises.
  • sas-model-publish-git is needed for Publishing to Git exercises.
  • sas-model-publish-kaniko is needed for Publishing to Azure exercises.

We need them Bound, with the sas-nfs storage class.

The same applies to the cas- PVCs.
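
A quick way to list just these PVCs with their status and storage class (a sketch; adjust the pattern if your deployment's names differ):

kubectl get pvc | grep -E 'sas-pyconfig|sas-microanalytic-score-astores|sas-model-publish-git|sas-model-publish-kaniko|^cas-'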

Python Validation

To make installing and managing open-source installations easier for SAS Viya administrators, SAS provides the SAS Configurator for Open Source.

Starting with SAS Viya stable version 2023.09, Python is installed using the SAS Configurator for Open Source.

How It Works

Once configured, the SAS Configurator for Open Source creates and executes a sas-pyconfig job that:

  • Downloads the source, signature file, and signer’s key from the configured location.
  • Verifies the authenticity of the Python source using the signer’s key and signature file.
  • Extracts the Python sources into a temporary directory for building.
  • Configures and performs a make of the Python sources.
  • Installs the Python builds within the PVC and updates supporting components, such as PIP, if applicable.
  • Builds and installs configured packages for Python.
  • If everything completes successfully, creates the symbolic links, or changes the symbolic links’ targets, to point to the latest Python builds.
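
Optionally, you can also confirm from the sasnode01 command line that the job ran and completed (a sketch; the exact job and pod names can vary by deployment):

kubectl get jobs | grep pyconfig
kubectl logs -l job-name=sas-pyconfig --tail=20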

How to Check

Log in to SAS Viya and go to SAS Studio (Develop Code and Flows):

Open a Python program:

Type:

print(1+2)
import pandas as pd

You should see the following, no errors:

OPTIONAL: Check Pods

SAS Micro Analytic Service

In the sasnode01 MobaXterm session, RUN:

kubectl get pods | grep sas-mi

You should see the status Running:

sas-microanalytic-score-6c6d87844c-rds7b                          2/2     Running     13 (32m ago)   98m

Describe the pods and scroll:

MASPOD=$(kubectl -n $GELENV_NS get pods | grep microanalytic | awk 'NR==1{print $1}')
kubectl -n $GELENV_NS describe pod $MASPOD
kubectl get pods | grep sas-mi

We want to see a python volume and an astores volume:
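
To list the volumes directly instead of scrolling through the describe output, you can use jsonpath (a sketch):

kubectl -n $GELENV_NS get pod $MASPOD -o jsonpath='{range .spec.volumes[*]}{.name}{"\n"}{end}' | grep -Ei 'python|astores'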

SAS Model Publish

kubectl get pods | grep sas-model-publish
PUBPOD=$(kubectl -n $GELENV_NS get pods | grep sas-model-publish | awk 'NR==1{print $1}')
kubectl -n $GELENV_NS describe pod $PUBPOD

You will need the following mounts and PVCs:
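
As with the SAS Micro Analytic Service pod, you can list the publish pod's volumes and the PVCs they map to with jsonpath (a sketch):

kubectl -n $GELENV_NS get pod $PUBPOD -o jsonpath='{range .spec.volumes[*]}{.name}{"\t"}{.persistentVolumeClaim.claimName}{"\n"}{end}'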

Conclusion

This validates you have a SAS Viya deployment correctly configured.

End


Lesson 03

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 03, Section 0 Exercise: 02 021 Rule Set

Create a Rule Set to Support an Auto Auction Decision

In this hands-on you will create a simple rule set in SAS Intelligent Decisioning.

Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass.

SAS Studio

Switch to SAS Studio:

  • Choose from the Applications menu, Develop Code and Flows or
  • Add /SASStudio/ after the SAS Viya URL.

Build the Test Data

1. Run the following code in SAS Studio to build an input table to support the development of the Rule Sets and Decisions.

cas mysession sessopts=(metrics=true);
caslib _all_ assign;

proc casutil incaslib='casuser' outcaslib='casuser';
    droptable casdata='AutoAuctionInput' quiet;
quit;

options dscas;

data CASUSER.AutoAuctionInput (promote=yes);
length Make $20 Model $20 state $2 Year 8 BlueBookPrice 8 CurrentBid 8 Miles 8 OriginalInvoice 8 OriginalMSRP 8 VIN $17;
Make="Honda"; Model="Accord"; State="LA"; Year=2009; BlueBookPrice=5000; CurrentBid=3000; OriginalInvoice=30000; OriginalMSRP=35000; Miles=50000; vin="12345678901234567"; output;
Make="Kia";   Model="Soul";   State="CA"; Year=2016; BlueBookPrice=8000; CurrentBid=9000; OriginalInvoice=18000; OriginalMSRP=19500; Miles=68000; vin="12345678901234568"; output;
Make="Honda"; Model="Civic";  State="AR"; Year=2017; BlueBookPrice=28000; CurrentBid=20000; OriginalInvoice=32000; OriginalMSRP=34000; Miles=20000; vin="12345678901234569"; output;
Make="Ford";  Model="Fusion"; State="CA"; Year=2012; BlueBookPrice=9000; CurrentBid=9000; OriginalInvoice=18000; OriginalMSRP=19500; Miles=70000; vin="12345678901234560"; output;
Make="Honda"; Model="Pilot";  State="MN"; Year=2012; BlueBookPrice=10000; CurrentBid=3000; OriginalInvoice=45000; OriginalMSRP=50000; Miles=100000; vin="12345678901234561"; output;
Make="Tesla"; Model="X100D";  State="CA"; Year=2017; BlueBookPrice=80000; CurrentBid=90000; OriginalInvoice=100000; OriginalMSRP=100000; Miles=5000; vin="12345678901234562"; output;
Make="Honda"; Model="CRV";    State="PA"; Year=2009; BlueBookPrice=12000; CurrentBid=8000; OriginalInvoice=30000; OriginalMSRP=35000; Miles=270000; vin="12345678901234563"; output;
Make="Buick"; Model="Regal";  State="NJ"; Year=2012; BlueBookPrice=8000; CurrentBid=7000; OriginalInvoice=35000; OriginalMSRP=40500; Miles=82000; vin="12345678901234564"; output;
Make="BMW";   Model="328i";   State="NY"; Year=2015; BlueBookPrice=35000; CurrentBid=40000; OriginalInvoice=55000; OriginalMSRP=60000; Miles=4000; vin="12345678901234565"; output;
Make="Scion"; Model="TC";     State="PA"; Year=2016; BlueBookPrice=10000; CurrentBid=9000; OriginalInvoice=18000; OriginalMSRP=19500; Miles=20000; vin="12345678901234566"; output;
Make="Honda"; Model="Accord"; State="MA"; Year=2010; BlueBookPrice=12000; CurrentBid=11000; OriginalInvoice=34000; OriginalMSRP=35000; Miles=80000; vin="12345678901234571"; output;
Make="Ford";  Model="F150";   State="FL"; Year=2016; BlueBookPrice=45000; CurrentBid=46000; OriginalInvoice=68000; OriginalMSRP=79500; Miles=90000; vin="12345678901234572"; output;
Make="GMC";   Model="Terrain";State="SC"; Year=2015; BlueBookPrice=40000; CurrentBid=30000; OriginalInvoice=60000; OriginalMSRP=65000; Miles=40000; vin="12345678901234573"; output;
Make="Ford";  Model="Fusion"; State="CA"; Year=2012; BlueBookPrice=8000; CurrentBid=9000; OriginalInvoice=18000; OriginalMSRP=19500; Miles=59000; vin="12345678901234574"; output;
run;

* Save as .sashdat that can be loaded every day;
proc casutil incaslib=casuser outcaslib=casuser;
    save casdata='AutoAuctionInput' casout='AutoAuctionInput.sashdat' copies=0 replace;
run;
quit;

2. For click-by-click instructions, watch the demonstration: Demo: RuleSet 01

3. Confirm the input table has been created:

AutoAuctionInput
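
If you prefer to confirm from code rather than from the library pane, a quick check (run it before terminating the CAS session):

proc casutil incaslib='casuser';
    list tables;
quit;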

Terminate your CAS Session

Run only the following code to terminate the CAS session.

cas mysession terminate;

SAS Intelligent Decisioning

Switch to SAS Intelligent Decisioning:

  • Choose Build Decisions from the Applications menu or

  • Add /SASDecisionManager/ after the SAS Viya URL.

  • Go to Rule Sets.

  • New Rule Set.

Define the Rule Set and Add Variables

Objectives:

1. Create a rule set with the attributes below:

  • Name: autoAuction
  • Type: Assignment
  • Location: /MyFolder

2. Include all the variables from the CASUSER.AutoAuctionInput table you created in the previous step.

3. Add two custom variables defined as below:

Variable Name Data Type Input Output
Bid Boolean unchecked checked
callOffice Boolean unchecked checked

4. For click-by-click instructions, watch the demonstration: Demo: RuleSet 02

Add Rules to the Rule Set

1. Add three rules to the rule set as shown in the screen shot below:

autoAuction1_0

The sequence:

  • Add assignment: ASSIGN Bid FALSE.
  • Add assignment: ASSIGN callOffice FALSE.
  • + Add Rule: blueBookRule
    • IF branch: IF CurrentBid < BlueBookPrice
    • + on the same line Add a condition
    • AND Miles < 50000
    • THEN ASSIGN Bid TRUE.
  • + Add Rule: change IF to ELSE:
    • ELSE branch: ELSE CurrentBid < 1.2 * BlueBookPrice
    • + on the same line Add a condition
    • AND Miles < 20000
    • THEN ASSIGN Bid TRUE.

Save the rule set.
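
For reference, the logic of the three rules you just added is equivalent to this DATA step fragment (a sketch of the intent only, not the rule builder syntax):

Bid = 0;
callOffice = 0;
if CurrentBid < BlueBookPrice and Miles < 50000 then Bid = 1;
else if CurrentBid < 1.2 * BlueBookPrice and Miles < 20000 then Bid = 1;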

2. For click-by-click instructions, watch the demonstration: Demo: RuleSet 03

Test the Rule Set

1. Using the CASUSER.AutoAuctionInput table you created, test that the rule set is working as expected.

Go to Scoring > Tests > New Test

  • Select CASUSER.AutoAuctionInput:

  • Run.
  • Results:

  • Output:

2. For click-by-click instructions, watch the demonstration: Demo: RuleSet 04

Solution

As an alternative, import the solution from 02_021_Rule_Sets_autoAuction1_0.json:

See 99_010_ImportSolution for more details:

  • With sasnode01 download the file from /home/cloud-user/PSGEL267-using-sas-intelligent-decisioning-on-sas-viya/solutions
  • Import it with SAS Environment Manager.
  • Create the AUTOAUCTIONINPUT table, as shown above.
  • Go to SAS Intelligent Decisioning and test the rule set.

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 03, Section 0 Exercise: 02 031 Lookup Table

Create and Use Lookup Tables to Support an Auto Auction Decision

Create lookup tables used in rule sets.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now. This exercise builds on 02_21_Rule_Set.

As an alternative, import the solution from 02_021_Rule_Sets_autoAuction1_0.json:

See 99_Admin\99_010_ImportSolution for more details.


Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass.

Create an Unwanted Makes Lookup Table

Unwanted Makes: Car brands that you don’t want to bid on unless they are very cheap.


Go to SAS Intelligent Decisioning (Build Decisions):

1. Create a Lookup table with the following attributes:

New Lookup Table:

  • Name: unwantedMakes
  • Location: /MyFolder
  • Leave Key and Value labels empty.
  • Save.

2. New Entries > Give the lookup table the following entries:

Key Value
Scion Scion
Buick Buick

3. Activate the lookup table.

When you activate, the version number changes.

4. Close.

For click-by-click instructions, watch the demonstration: Demo: Lookup 01

Update: in the latest LTS, the screen looks slightly different.


Create a bidCommands Lookup Table

Bid Commands: Labels for the TRUE/FALSE (1/0) Boolean Bid instruction.


1. Create and activate a second lookup table:

New Lookup Table:

  • Name: bidCommands
  • Location: /MyFolder
  • Leave Key and Value labels empty.
  • Save.

2. New Entries > Give the lookup table the following entries:

Key Value
0 Do NOT bid on this car!
1 Bid on the car!

3. Activate the lookup table.

When you activate, the version number changes.

4. Close.

For click-by-click instructions, watch the demonstration: Demo: Lookup 02

Note: in the latest LTS, the screen looks slightly different.


Create a New Rule Set

  • Go to Rule Sets.

1. Create a new rule set that uses the lookup tables to support an auto auction bidding scheme. Give the rule set these attributes:

  • Name: autoAuctionMake
  • Type: Assignment
  • Location: /MyFolder

2. Add all the variables from the autoAuction rule set (previous exercise).

3. Add one custom variable defined as below:

Variable Name Data Type Input Output
bidCommand Character unchecked checked


4. Add the following logic:

The sequence:

  • Add Rule: IF-THEN ELSE statement
  • Name rule: makeRule
    • IF branch: IF Make LOOKUP
    • Select a lookup table: From My Folder choose unwantedMakes
    • THEN ASSIGN Bid FALSE

  • + Add Rule: change IF to ELSE
    • ELSE line: Delete the condition -> Recycle Bin icon.
    • THEN line: THEN ASSIGN Bid TRUE

  • Add assignment:
    • Change ASSIGN to LOOKUPVALUE
    • LOOKUPVALUE bidCommand
    • Select a lookup table: From My Folder choose bidCommands
    • Change the lookup variable to Bid.

You should see:

  • Save.
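
For reference, the combined intent of makeRule and the LOOKUPVALUE assignment is roughly the following (a sketch, with the lookup tables written out as literal values):

if Make in ('Scion', 'Buick') then Bid = 0;       /* unwantedMakes lookup */
else Bid = 1;
if Bid = 1 then bidCommand = 'Bid on the car!';   /* bidCommands lookup keyed on Bid */
else bidCommand = 'Do NOT bid on this car!';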

5. For click-by-click instructions, watch the demonstration: Demo: Lookup 03


Test the New Rule Set

1. Using the CASUSER.AutoAuctionInput table you created, test that the rule set is working as expected.

Go to Scoring > Tests > New Test

  • Input table: CASUSER.AutoAuctionInput:

  • Run.
  • Results:

2. Your test result should look like this:

  • Click on the icon under Results:

3. Refer to the section Test your Rule Set, of the previous exercise if you need assistance in testing the new rule set.


Lesson 04

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 04, Section 0 Exercise: 03 061 Decision

Create a Decision to support an Auto Auction Participant

Create a decision using rule sets, code and branches.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table

As an alternative import the solution from 02_031_Lookup_Table.json:

See 99_Admin\99_010_ImportSolution for more details.

Activate Lookup Tables

You need to activate each of the lookup tables, manually, before continuing.

Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass.

Create the Decision

1. Within SAS Intelligent Decisioning, create a new decision. Name it autoAuctionDec. Save it to /My Folder:

2. Add all the variables of the autoAuction rule set.

3. Add ONLY the variable bidCommand of the autoAuctionMake rule set (exercise 1).

4. Design the decision as shown below:

5. In the Decision Flow:

  • Under Start, add Branch below:

  • Branch, choose Equals:

  • Branch expression: state:

  • Paths: +. Under value add ‘PA’:

  • autoAuction and autoAuctionMake are the rule sets you created in the previous exercises.
  • On the PA branch, click Add nodes here and add rule set autoAuctionMake from My Folder:

  • On the Other branch, click Add nodes here and add rule set autoAuction from My Folder:

  • Click on End and add above

  • californiaOverride is a DS2 Code file. Its DS2 code is shown below.

  • New DS2 Code File, name it californiaOverride.
  • Open and use the following code:

package "${PACKAGE_NAME}" /inline;
   method execute(in_out double Bid,
                  in_out varchar bidCommand,
                  in_out varchar state);
    if state = 'CA' then do;
       Bid = 1;
       bidCommand = 'Buy anything from California!';
   end;
   end;
endpackage;
  • Validate and save the DS2 file.

  • Validate and save the decision.

6. For click-by-click instructions, watch the demonstration: Demo: Decision 01


Test the Decision

1. Test the Decision using the Scoring tab to ensure the Decision works using the CASUSER.AUTOAUCTIONINPUT CAS table you created in the first hands-on exercises.

2. Decisions are tested in exactly the same manner as rule sets. For a reminder on the click-path, see the section Test your Rule Set, of the previous 02_Rule_Set exercise:

  • Go to Scoring > Tests > New Test
  • Select CASUSER.AutoAuctionInput:
  • Run.
  • Results.
  • Output:

3. Examine the results to see the decision logic was correctly applied.

  • Move the state next to bidCommand.

Note:

  • Records with a state value of PA were scored with the autoAuctionMake rule set.

  • Records with a non-PA state were scored with the autoAuction rule set.

  • Records with a state value of CA output values were overwritten by the californiaOverride DS2 logic.

  • The bidCommand field is blank for non-California and non-Pennsylvania records that went down the Other branch because the autoAuction rule set did not set this variable. (Feel free to remedy this if you like, by modifying the DS2 code file.)
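
If you want those records to get a bidCommand as well, one option is to extend the DS2 code file, for example (a hypothetical variant of californiaOverride, not part of the solution files):

package "${PACKAGE_NAME}" /inline;
   method execute(in_out double Bid,
                  in_out varchar bidCommand,
                  in_out varchar state);
      if state = 'CA' then do;
         Bid = 1;
         bidCommand = 'Buy anything from California!';
      end;
      /* hypothetical addition: label records that arrive without a bidCommand */
      else if bidCommand = '' then do;
         if Bid = 1 then bidCommand = 'Bid on the car!';
         else bidCommand = 'Do NOT bid on this car!';
      end;
   end;
endpackage;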



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 04, Section 0 Exercise: 03 062 Model

Add an Analytic Model to a Decision to support an Auto Auction Participant

Add a model to an existing decision.

Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision


As an alternative, import the solutions from 03_061_Decision_autoAuctionDec.json:

For a reminder on the click-path, see 99_Admin\99_010_ImportSolution for more details.

You need to activate each of the lookup tables, manually, in SAS Intelligent Decisioning, before continuing.


Create a (Trivial) Predictive Model

1. Using Model Studio (Build Models), create a trivial predictive model using the AUTOAUCTIONINPUT table as input. The model, a simple linear regression, predicts BlueBookPrice from the input variables.

  • From Projects > New project:

  • Name: carScore
  • Type: Data Mining and Machine Learning
  • Template: Browse: Basic template for interval target
  • Data: CASUSER.AUTOAUCTIONINPUT
  • Save.

Variables:

When the project is created, open it. From Data set the following variable assignments:

Variable Role Level Order
BlueBookPrice Target Interval
CurrentBid Input Interval
Miles Input Ordinal Ascending
OriginalInvoice Input Interval
OriginalMSRP Input Interval
VIN Key Nominal
Year Input Ordinal Descending

Leave the other settings to their defaults.

Pipeline:

  • Open the pipeline.

  • Click on the link between Imputation and Linear Regression.

  • Insert: Data Mining Preprocessing > Variable Selection:

  • Variable Selection: From Properties > Combination Criterion > select Linear Regression Selection:

  • Run the pipeline.

  • After the pipeline completes, in a few minutes, go to the Pipeline Comparison tab, click the three dots and select Register Models:

  • Accept the default location, /Model Repositories/DMRepository.

  • Optional: You can verify in SAS Model Manager (Manage Models from the hamburger menu), that your model was registered.

2. For click-by-click instructions, watch the demonstration: Demo: Model 01


Integrate the Model into your Decision

In SAS Intelligent Decisioning integrate the model into your decision as shown below:


Steps:

1. Open autoAuctionDec decision:

2. Add two custom variables, output only, corresponding to the model output:

Variable Type Input Output
EM_PREDICTION Decimal unchecked checked
P_BlueBookPrice Decimal unchecked checked

3. Under Start, add the model:

  • Select from DMRepository > carScore > Linear Regression until the OK button is available.

4. Map the new variables:

  • In autoAuction rule set input variables, map BlueBookPrice with EM_PREDICTION. The model output becomes the rule set input:

  • Save the decision.

For click-by-click instructions, watch the demonstration: Demo: Model 02


Note: You could also modify the Rule Set to explicitly use the EM_PREDICTION field by changing the blueBookRule’s condition to CurrentBid < EM_PREDICTION. However, if you modify the rule set, it will no longer function with the AUTOAUCTIONINPUT data set.


Test the Decision

1. Test the Decision using the Scoring tab to ensure the Decision works using the CASUSER.AUTOAUCTIONINPUT CAS table you created in the first hands-on exercises.

2. Decisions are tested in exactly the same manner as rule sets. For a reminder on the click-path, see the section Test your Rule Set, of the previous 02_Rule_Set exercise:

  • Go to Scoring > Tests > New Test
  • Select CASUSER.AutoAuctionInput:
  • Run.
  • Results.
  • Output:

3. At the end you should see:


Lesson 05

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 05, Section 0 Exercise: 04 031 Publish CAS

Publish a Decision to CAS

Create a publishing destination and publish a decision.

Note: SAS Intelligent Decisioning can publish rule sets as well as decisions.

Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now. This exercise builds on those.

  • 02_21_Rule_Set.
  • 02_31_Lookup_Table.
  • 03_061_Decision.
  • 03_062_Model.

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.


For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass.

Create the CAS Publishing Destination

Configuration Steps:

  • Sign in to SAS Environment Manager as a SAS Administrator.
  • You must have write access to a global caslib.
  • Click Publishing Destinations in the navigation bar. The Publishing Destinations page appears.
  • Click New. The New Publishing Destination window appears.
  • Type: CAS
  • Name: casDecision
  • Description: Decisions published to CAS
  • CAS server: choose the default
  • CAS library: Public
  • Model table: choose the default setting.



Publish the Decision to CAS

In SAS Intelligent Decisioning:

1. Publish the autoAuctionDec decision you created in the previous exercises to CAS.

2. From any tab within the Decision, push the Publish button and follow the prompts.

3. Select the CAS publishing destination. Go with the default name autoAuctionDec1_0.



4. If the publication fails, try again and check Replace.

5. For click-by-click instructions, watch the demonstration: Demo: Publish CAS 01

Update: in the latest LTS, the publish screen looks slightly different.

Validate the Published Decision

1. Validate the published decision runs successfully in CAS.

2. From the Scoring > Publishing Validation tab, run the validation job that was created during publication.

3. Inspect the output and code used to run the published decision in CAS.

4. For click-by-click instructions, watch the demonstration: Demo: Publish CAS 02



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 05, Section 0 Exercise: 04 041 Publish SAS Micro Analytic Service

Publish a Decision to SAS Micro Analytic Service

Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now. This exercise builds on those.

  • 02_21_Rule_Set.
  • 02_31_Lookup_Table.
  • 03_061_Decision.
  • 03_062_Model.

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.


For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Publish the Decision to SAS Micro Analytic Service

1. Publish the autoAuctionDec decision you created in the previous exercises to SAS Micro Analytic Service.

2. From any tab within the Decision, push the Publish button and follow the prompts.

3. Select the SAS Micro Analytic Service publishing destination. Go with the default name autoAuctionDec1_0.



4. Select the SAS Micro Analytic Service publishing destination and follow the UI.

5. If the publication fails, try again and check Replace.

6. For click-by-click instructions, watch the demonstration: Demo: Publish MAS 01

Update: in the latest LTS, the publish screen looks slightly different.


Validate the Published Decision

1. Validate the published decision runs properly in the SAS Micro Analytic Service.

2. From the Scoring > Publishing Validation tab, choose the SAS Micro Analytic Service validation job that was created during publication, then run it.

3. Inspect the output and code used to run the published decision in the SAS Micro Analytic Service.

4. Note the REST endpoint used to call SAS Micro Analytic Service as well as the request body message sent to the endpoint (a sketch of such a call follows this list).

5. For click-by-click instructions, watch the demonstration: Demo: Publish MAS 02
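
For illustration only, a call to the published module would look roughly like this (a sketch: the host, access token, and module ID are assumptions; the module ID is typically the published name in lowercase, and the inputs list must match the decision's input variables):

curl -k -X POST "https://${INGRESS_FQDN}/microanalyticScore/modules/autoauctiondec1_0/steps/execute" \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"inputs":[
        {"name":"Make","value":"Honda"},
        {"name":"Model","value":"Accord"},
        {"name":"state","value":"LA"},
        {"name":"Year","value":2009},
        {"name":"BlueBookPrice","value":5000},
        {"name":"CurrentBid","value":3000},
        {"name":"Miles","value":50000},
        {"name":"OriginalInvoice","value":30000},
        {"name":"OriginalMSRP","value":35000},
        {"name":"VIN","value":"12345678901234567"}
      ]}'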



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 05, Section 0 Exercise: 04 051 Publish Azure 1

Create Azure Publishing Destination

Create an Azure Publishing Destination.

Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now. This exercise builds on those.

  • 02_21_Rule_Set.
  • 02_31_Lookup_Table.
  • 03_061_Decision.
  • 03_062_Model.

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.


For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Puzzle Pieces

To create the publishing destination you will need:

Azure Components

Component Variable / Value Status
Tenant ID $TENANTID Exists
Subscription ID $SUBSCRPTN Exists
Prefix $PREFIX Exists
Prefix no dash $PREFIXNODASH Exists
Container Registry ${PREFIXNODASH}acr To create
Resource group RG Exists
Kubernetes Service $PREFIX-aks Exists
Location $LOCATION Exists
App registration psgel267-app Pre-created
App client id $APP_CLIENT_ID Pre-created
App client secret $APP_CLIENT_SECRET Pre-created
App client id BASE64 encoded $APP_CLIENT_ID_BASE64 To create
App client secret BASE64 encoded $APP_CLIENT_SECRET_BASE64 To create
Destination Name testACR To create
Destination Description Test ACR To create

Azure Actions

Component Type Status
App RBAC ACR Contributor To create
Attach ACR to AKS To create
Node-Port NSG rule Inbound To create

SAS Components

Component Value Status
Viya INGRESS_FQDN or Viya URL ${RG}.gelenable.sas.com Exists
User $SASUSER Exists
Password $AZUPW Exists
Credential Domain Name ACRCredDomainRho To create
Credential Domain Description Azure ACR credentials Rho version To create

SAS Actions

Component Type Status
Credential Domain To create
Publishing Destination To create

Azure Resources

For click-by-click instructions, watch the demonstration: Demo: Publish Azure 01


Gather Workshop Variables

Run the following command on the sasnode01 MobaXterm session:

# Get your USER Prefix
getprefix

# AZ LOGIN
azlogin

# Get workshop variables
RG=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep resource-group | awk -F'::' '{print $2}')
LOCATION=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep location | awk -F'::' '{print $2}')
PREFIX=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep prefix | awk -F'::' '{print $2}')
TENANTID=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep tenant | awk -F'::' '{print $2}')
WORKSHOP_SUBSCRIPTION_ID=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep subscription | awk -F'::' '{print $2}')
SUBSCRPTN=$(cat ${VARS_DIR}/variables.txt | grep subscription | awk -F'::' '{print $2}')
PREFIXNODASH=$(echo $PREFIX | sed 's/-//g')
CLIENTID=$(az ad sp list --filter "displayname eq '${GIT_WKSHP_CODE}_sp'"  --query '[].{appId:appId}' -o tsv)

# Get TAGS
TAGS=$(cat /home/cloud-user/MY_TAGS.txt | sed 's/[,"]//g;s/ = /=/g')

# Get Storage Account
SANAME=$(echo $PREFIX | sed 's/-//g')sa
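
Before moving on, it helps to echo the gathered variables and confirm that none of them is empty:

echo $RG && echo $LOCATION && echo $PREFIX && echo $PREFIXNODASH
echo $TENANTID && echo $SUBSCRPTN && echo $CLIENTID && echo $SANAME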

Create an Azure Container Registry (ACR)


echo -e "\n*** MY AZURE CONTAINER REGISTRY FOR SAS VIYA                = ${PREFIXNODASH}acr" && echo -e "\n*** will be created in RESOURCE GROUP FOR SAS VIYA          = $PREFIX-rg" && echo -e "\n*** in this Azure location                                  = $LOCATION"

# Create an Azure Container Registry
az acr create --resource-group $PREFIX-rg --name ${PREFIXNODASH}acr --sku Basic --location $LOCATION --admin-enabled --tags $TAGS

# Get ACR access token
# not needed now
# az acr login --name ${PREFIXNODASH}acr --expose-token
# docker login loginServer -u 00000000-0000-0000-0000-000000000000 -p accessToken
# Login succeeded message

Attach ACR to Azure Kubernetes Service (AKS)

The AKS cluster needs to be integrated with the ACR (exchanging keys and secrets) to allow publishing from SAS Viya.

az aks update -n $PREFIX-aks -g ${RG} \
--attach-acr /subscriptions/${SUBSCRPTN}/resourceGroups/${RG}/providers/Microsoft.ContainerRegistry/registries/${PREFIXNODASH}acr

This will take two minutes or so…

You will see a long list of update actions taken.
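
If your Azure CLI version includes the check-acr command, you can verify the integration afterwards (a sketch):

az aks check-acr --resource-group ${RG} --name ${PREFIX}-aks --acr ${PREFIXNODASH}acr.azurecr.io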

Azure App Registration and Secret

An app was already registered and a secret created for this workshop.

The creation steps would be the following (expand to reveal; no need to execute, this is just for your information):

Log in the Azure Portal:

Register a new app:

  • Name: GELENABLE_WORKSHOP_CODE-app e.g. PSGEL267-app
  • From Certificates & secrets > Create a client secret… e.g. PSGEL267-app-secret
  • Expires: Custom
  • Choose the duration:
  • Create.

Copy the secret NOW! You will not be able to see it after you move away from the screen! The secret would look like: lD07Q~HJ6TcBCwG~MLTwHFCrq0h3_CtVfRt76

You will need two more elements:

  • The Application (client) ID. You can get it from the overview page. The client id will look like: 1b1ee5e0-0308-4efb-84cf-6a10da77f0dd

  • The Object ID. You can get it from the overview page > Managed application in local directory > click on the hyperlink. It is the third element from the top, the Object ID, and it will look like: d5f75ac5-876d-4f36-b4e9-35fcb02656da

The secret is retrieved from an Azure Key Vault.

Browse the file to see the details:

Assignee_ID=d5f75ac5-876d-4f36-b4e9-35fcb02656da
APP_CLIENT_ID=1b1ee5e0-0308-4efb-84cf-6a10da77f0dd
APP_CLIENT_SECRET=$(az keyvault secret show --name APP-CLIENT-SECRET --vault-name psgel267keyvault --query "value" -o tsv)

echo "Your Azure App Object ID is ${Assignee_ID}"
echo "Your Azure App Client ID is ${APP_CLIENT_ID}"
echo "Your Azure App Secret is... well, it's secret :-)"

Assign the Contributor Role to the App

To get the required permissions to push images to the ACR registry, the Contributor role is needed for the app registration.

Script

Assign the ‘Contributor’ role for GELENABLE_WORKSHOP_CODE-app to your ACR registry.

# check needed variables
echo $PREFIX && echo $PREFIXNODASH
echo $RG && echo ${PREFIXNODASH}acr
echo $SUBSCRPTN && echo $APP_CLIENT_ID

# assign Contributor role to PSGEL267-app using its Object ID (Assignee_ID)

az role assignment create --assignee-object-id ${Assignee_ID} --assignee-principal-type ServicePrincipal \
--role "Contributor" \
--scope /subscriptions/${SUBSCRPTN}/resourceGroups/${RG}/providers/Microsoft.ContainerRegistry/registries/${PREFIXNODASH}acr

You should see something like:

{
  "canDelegate": null,
  "condition": null,
  "conditionVersion": null,
  "description": null,
  "id": "/subscriptions/e770a687-bc40-4dad-a4c8-8a260676aa24/resourceGroups/sbxbot-r-0158-rg/providers/Microsoft.ContainerRegistry/registries/sbxbotr0158acr/providers/Microsoft.Authorization/roleAssignments/9897e47c-e1d8-434a-9126-ff36931d2321",
  "name": "9897e47c-e1d8-434a-9126-ff36931d2321",
  "principalId": "d5f75ac5-876d-4f36-b4e9-35fcb02656da",
  "principalType": "ServicePrincipal",
  "resourceGroup": "sbxbot-r-0158-rg",
  "roleDefinitionId": "/subscriptions/e770a687-bc40-4dad-a4c8-8a260676aa24/providers/Microsoft.Authorization/roleDefinitions/b24988ac-6180-42a0-ab88-20f7382dd24c",
  "scope": "/subscriptions/e770a687-bc40-4dad-a4c8-8a260676aa24/resourceGroups/sbxbot-r-0158-rg/providers/Microsoft.ContainerRegistry/registries/sbxbotr0158acr",
  "type": "Microsoft.Authorization/roleAssignments"
}

For Info Only: Using the Azure Portal

NO need to execute, for info only.

Manual steps: The process to assign the Contributor role to the app would be the following (expand to reveal):

To assign the Azure app roles to allow access to push and pull images in the Azure Container Registry:

  • Go to your Azure Container Registry > Access control > Role assignments > Add > Search for the $PREFIX-app app and add the Contributor role.

Rather than using these manual steps, we used the Azure CLI above to assign the permission.

OPTIONAL: Use the Azure Portal to Confirm the Role Assignment

You can view the role assignment using the Azure management portal:

  • Log in the Azure portal as geldmui@gelenable.sas.com. You can retrieve the password by typing getpw in sasnode01.
  • View the Azure Container Registry resources under the $RG resource group
  • Sort resources by Type ascending
  • Select your container registry ${PREFIXNODASH}acr.
  • On the left blade, choose the Access control (IAM) > Role assignments panel to view the roles for the registry.

Create the Publishing Destination Using the SAS Viya CLI

As of 2022.09, the publishing destination creation has been simplified. You can now use the SAS Viya CLI to create the publishing destination.

Using the SAS Viya CLI is the recommended way!

Attention points:

  • You must use the variables $APP_CLIENT_SECRET and $APP_CLIENT_ID as they are.
  • DO NOT encode them in BASE64!
  • The descriptions must be double quoted.
  • Do not use line separators (\) in the SAS Viya CLI statements, as the command may fail.

# login with the sas-viya cli profile already created
cd ~
export SAS_CLI_PROFILE=${GELLOW_NAMESPACE:-Default}
export SSL_CERT_FILE=~/.certs/${GELLOW_NAMESPACE}_trustedcerts.pem
sas-viya --profile ${SAS_CLI_PROFILE} auth login --user sasboot --password lnxsas

# Use the CLI
sas-viya --profile ${SAS_CLI_PROFILE} models destination createAzure --help

# repeat variables
NS=$GELENV_NS
INGRESS_FQDN=${RG}.gelenable.sas.com
SASUSER=geldmui@gelenable.sas.com
DOMAIN_NAME="AzureDomain"
ACR_DEST_NAME="Azure"
ACR_DEST_DESC="Azure SAS Container Runtime "
ACR_SERVER=${PREFIXNODASH}acr.azurecr.io
AKS_NAME=$PREFIX-aks

# list variables which will be used
printf "\nThe destination will be created with these parameters \n"
export ACR_DEST_NAME_CLI=AzureCLI
echo "Name: $ACR_DEST_NAME_CLI"
echo $INGRESS_FQDN
echo $PREFIX && echo $PREFIXNODASH
echo ${PREFIXNODASH}acr
echo "baseRepoURL: $ACR_SERVER"
echo "Subscription: $SUBSCRPTN"
echo "Tenant: $TENANTID"
echo "Region: $LOCATION"
echo "Cluster Name: $PREFIX-aks"
echo "Resource group: $RG"
echo "Identity: $SASUSER"
echo "App Client ID: $APP_CLIENT_ID"
echo "App Client Secret: $APP_CLIENT_SECRET"

sas-viya --profile ${SAS_CLI_PROFILE} models destination createAzure --name ${ACR_DEST_NAME_CLI} --description "ACR with SAS Viya CLI" --baseRepoURL ${ACR_SERVER} --subscriptionId ${SUBSCRPTN} --tenantId ${TENANTID} --region ${LOCATION} --kubernetesCluster ${PREFIX}-aks --resourceGroupName ${RG} --credDomainID "ACRCredDomainCLIRomeo" --credDescription "Azure ACR credentials CLI Romeo" --clientId ${APP_CLIENT_ID} --clientSecret ${APP_CLIENT_SECRET} --identityType user --identityId ${SASUSER}

# a second example
# sas-viya --profile ${SAS_CLI_PROFILE} models destination createAzure --name "Test ACR" --description "Test ACR" --baseRepoURL ${PREFIXNODASH}acr.azurecr.io --subscriptionId ${SUBSCRPTN} --tenantId ${TENANTID} --region ${LOCATION} --kubernetesCluster ${PREFIX}-aks --resourceGroupName ${RG} --credDomainID "ACRCredDomainCLIU" --credDescription "Azure ACR credentials CLI U" --clientId ${APP_CLIENT_ID} --clientSecret ${APP_CLIENT_SECRET} --identityType user --identityId ${SASUSER}

Additional Info

If you ever need to delete a publishing destination, uncomment the code below, then execute it:

#sas-viya --profile ${SAS_CLI_PROFILE} models destination delete --name ${ACR_DEST_NAME_CLI}

Log in to SAS Viya

Connect to SAS Viya.

Check the Publishing Destination

Switch to SAS Environment Manager:

  • Add /SASEnvironmentManager/ after the SAS Viya URL or,
  • Choose from the Applications menu, Manage Environment.

Go from the left menu to Publishing Destinations.

Identify the Azure publishing destination. Select and click on Properties.

Azure Publishing Destination

You can now publish decisions to Azure.

Test the Publishing Destination

Go to SAS Intelligent Decisioning.

Publish the decision autoAuctionDec to Azure.

This will take a few minutes. When complete you can check the history:

History

On the Azure Portal, you can check in your resource group, in your container registry, Repositories from the left blade:

Your Azure resources can be retrieved by typing in sasnode01:

PREFIX=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep prefix | awk -F'::' '{print $2}')
PREFIXNODASH=$(echo $PREFIX | sed 's/-//g')
echo ${PREFIXNODASH}acr && echo $RG
Decision in ACR


Other Resources


For more info see Mike’s post: Creating model publishing destinations using the SAS Viya CLI.

Conclusion

That completes the publishing destination creation.


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 05, Section 0 Exercise: 04 051 Publish Azure 2

Validate a Decision Published to Azure


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now. This exercise builds on those.

  • 02_21_Rule_Set.
  • 02_31_Lookup_Table.
  • 03_061_Decision.
  • 03_062_Model.
  • 04_051_Publish_Azure_1.

As an alternative, to save time, you can import the solutions mentioned in 04_051_Publish_Azure_1:


Azure Publishing Validation

For click-by-click instructions, watch the demonstration: Demo: Publish Azure 04


Node Port NSG Rule

Publishing validation requires a set of ports to be opened.

1. Open the Azure Kubernetes Service (AKS) cluster’s network security group to access NodePort services. Add an Inbound Security Rule from the Load Balancer IP of AKS cluster (OUTBOUND IP) to the IP address of the cluster (INBOUND IP) on ports 30000-32767.

Special thanks to Hans-Joachim Edbert for finding the right settings:

With MobaXterm on sasnode01:

# env variables
echo $PREFIX && echo $RG
export KUBECONFIG=~/.kube/config
echo $KUBECONFIG

# find the internal AKS resource group
AKS_RG=$(az aks list -g $RG --query [].nodeResourceGroup -o tsv)
echo "AKS resource group: $AKS_RG"

# find the LB IP address of the AKS cluster (outbound)
# OUTBOUND_IP=$(az network public-ip list --query "[].{address: ipAddress}" -o tsv -g $AKS_RG | cut -f2)

#OUTBOUND_IP=$(az network public-ip list -g $AKS_RG --name "[?dnsSettings.fqdn=='${PREFIX}-rg.gelenable.sas.com'].{tags: tags.type, address: ipAddress}" -o tsv | awk '{print $2}')

# echo "AKS outbound IP: $OUTBOUND_IP"

# find the IP address of the AKS cluster (inbound)
INBOUND_IP=$(az network public-ip list --query "[].{tags: tags.type, address: ipAddress}" -o tsv -g $AKS_RG | grep None | cut -f2)
echo "AKS inbound IP: $INBOUND_IP"

# Get NSG of AKS resource group
AKS_NSG=$(az network nsg list -g $AKS_RG --query [].name -o tsv)
echo "AKS NSG: $AKS_NSG"

### review

# Create inbound nsg rule
az network nsg rule create -g $AKS_RG --nsg-name $AKS_NSG -n AllowNodePortRange \
   --priority 400 \
   --source-address-prefixes $INBOUND_IP \
   --source-port-ranges '*' \
   --destination-address-prefixes $INBOUND_IP \
   --destination-port-ranges '30000-32767' --access Allow \
   --protocol Tcp --description "Allow access to pods via nodeport"

# Rule is created in aks-agentpool-yyyyyyyy-nsg

Add Role to Your App

In Azure, the authorized operations on Azure objects are called roles. You will create a role-based access control (RBAC) assignment.

Assign the Azure app PSGEL267-app roles to access the Azure Kubernetes Service validation cluster. Select the AKS instance $PREFIX-aks and click on the left blade Access control (IAM). Add a role assignment for the Contributor role.

Assignee_ID=d5f75ac5-876d-4f36-b4e9-35fcb02656da
APP_CLIENT_ID=1b1ee5e0-0308-4efb-84cf-6a10da77f0dd
# check needed variables
echo $PREFIX && echo $PREFIXNODASH
echo $RG && echo ${PREFIXNODASH}acr
echo $SUBSCRPTN && echo $APP_CLIENT_ID && echo ${Assignee_ID}

# assign Contributor role to PSGEL267-app using its Object ID (Assignee_ID)

az role assignment create --assignee-object-id ${Assignee_ID} --assignee-principal-type ServicePrincipal \
--role "Contributor" \
--scope /subscriptions/${SUBSCRPTN}/resourceGroups/${RG}/providers/Microsoft.ContainerService/managedClusters/${PREFIX}-aks


Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass.

Validate the Publishing on the Azure Publishing Destination

Go to SAS Intelligent Decisioning:

  • Go to Scoring > Publishing Validation.
  • Select the validation with the destination type Azure.
  • Run it.

When the validation is complete, the log will display something similar to:

You don’t need to wait for it to complete; you can go to the next exercise and come back later.

Conclusion

Publishing a decision to Azure creates a container image in an Azure Container Registry.

Validation deploys the container image to an Azure Kubernetes Cluster pod. There, every record is scored through the SAS Container Runtime REST API.

That completes the publishing destination validation.



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 05, Section 0 Exercise: 04 071 Publish Git 1

Publish a Decision to Git

Objectives: publish a decision or rule set to GitHub. You will require your own GitHub account.

GitLab Note: Because SAS Viya in this workshop is deployed on Azure, which sits outside sas.com, you need to use GitHub. You could also use GitLab.com, but you cannot use GitLab.sas.com, because firewall restrictions block access to gitlab.sas.com from outside SAS.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do it now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.


For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


What Can You Publish to Git?

You can publish rule sets or decisions from SAS Intelligent Decisioning to Git; see the SAS documentation for more information.

As the decisions can include SAS models, several model types are allowed. Please see SAS® Intelligent Decisioning: Publishing Decisions for more information.

Publish a Decision to Git Publishing Destination

The process can be broken down into several steps:

  1. Firstly, in SAS Viya, create a Git publishing destination.
    1. Fulfill the configuration prerequisites.
    2. Create the publishing destination.
  2. Secondly, publish the SAS decisions to Git.
  3. Thirdly, deploy the SAS decisions from Git to another SAS Viya environment. You can choose to deploy to:
    1. SAS Micro Analytic Service. You can then score (execute) the SAS decision, using a REST API.
    2. Cloud Analytic Services (CAS). You can run the SAS decision in batch, using CAS actions.

Note: Once you define a Git publishing destination in SAS Viya, the destination is shared with SAS Model Manager. Therefore, the steps above apply to SAS models as well.

Configure the Git Repository

To publish from SAS Viya to Git, you will need a Git repository. You could use GitLab, GitHub or any other Git flavor. The following example is using a GitHub repository.

You will require from your GitHub account:

  • GitHub repository.
  • Git repository URL.
  • Git repository branch.
  • GitHub user.
  • GitHub account user email.
  • Personal Access Token (PAT), which acts as a password to access the repository, from SAS.

GitHub Account

If you do not have a GitHub account yet, you need to Sign-Up.

Create a Git Repository

Create a GitHub Repository:

  • Name it sas-viya-az-devops
  • Description: SAS Viya on Azure
  • Make it Private!!! please!
  • Add a README file.

Tip: From a security perspective, SAS Viya can work with a private GitHub repository. Private repositories are only accessible to you, or to people you explicitly share access with.

Watch the following video:

Create a Personal Access Token (PAT)

A Personal Access Token (PAT) acts as a password to access the repository, from SAS Viya.

To create the PAT:

  • Go to https://github.com/settings/tokens/new
  • Log-in if directed.
  • Choose Tokens (classic)
  • Call the PAT sas-viya-az-token.
  • Expiration: 7 days.
  • Select scopes:
    • repo (all)
    • user (all)

  • Generate token.

Watch the following video if you need assistance on how to create a PAT in GitHub:

After you create the PAT, COPY IT. You will not be able to retrieve it later.

  • Paste it in Notepad++. Initialize a variable called GITHUB_PAT. You should have a string of roughly this size:
GITHUB_PAT=ghp_z*jc2y*Z*8bG91Y*P9bSJy8MZ8j*Ne3h*uHu

Save it somewhere safe on your PC in a file pat.txt.

Treat your PAT as you would treat a personal password!

Git Repository URL

Retrieve your Git repository URL:

In the same text file, prepare the following. Change the variables to reflect your context:

GITHUB_REPO_URL=fill_in_here
# e.g., GITHUB_REPO_URL=https://github.com/bteleuca/sas-viya-az-devops.git
GITHUB_REPO=sas-viya-az-devops
#  e.g., GITHUB_REPO=sas-viya-az-devops

Git Repository branch, user and user email

In the same text file, prepare the following. Change the variables to reflect your context (GitHub user, email, etc.):

GITHUB_USER=fill_in_your_user_here
# GITHUB_USER Your GitHub user, typically after https://github.com/ e.g., GITHUB_USER=bteleuca
GITHUB_EMAIL=fill_in_your_email_here
# GITHUB_EMAIL=Your GitHub account registration email
GITHUB_BRANCH=main

Save the txt file somewhere you can retrieve it later.


INFO ONLY: SAS Viya Configuration Prerequisites

The SAS Viya sas-model-publish pod is responsible for publishing to Git. The pod is a group of one or more containers with shared storage and resources.

This pod requires a Persistent Volume Claim called Git. A Persistent Volume Claim (PVC) is a request for storage that a pod can consume.

For our course, you can skip this paragraph. Everything was configured at deployment.

We checked the requirements in 00_030_PVCs_Pods_Check.

The following video explains in a few words the required configuration: How to Publish SAS Decisions to Git: Required Configuration.


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 05, Section 0 Exercise: 04 071 Publish Git 2

Create the Git Publishing Destination

Objectives: create the Git publishing destination in SAS Viya.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises do it now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_071_Publish_Git_1

As an alternative, to save time, you can import the solutions mentioned in 04_071_Publish_Git_1:


Steps

  • Initialize needed variables:
    • SAS Viya
    • Git
  • Create the Git Publishing Destination using the SAS Viya CLI.

Gather Workshop Variables

1. Run the following command on the sasnode01 MobaXterm session:


# Get your USER Prefix
getprefix

# AZ LOGIN
azlogin

# Get workshop variables
PREFIX=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep prefix | awk -F'::' '{print $2}')
RG=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep resource-group | awk -F'::' '{print $2}')

source <( cat ${FUNCTION_FILE}  )
source <( cat /opt/gellow_work/vars/vars.txt  )

getcertificates () {

source /opt/gellow_work/vars/vars.txt

echo "Getting certificates for" ${GELLOW_NAMESPACE}
mkdir -p ~/.certs
kubectl -n ${GELLOW_NAMESPACE} cp $(kubectl get pod -l app=sas-logon-app -n ${GELLOW_NAMESPACE}  -o=jsonpath='{.items[0].metadata.name}'):security/trustedcerts.pem ~/.certs/${GELLOW_NAMESPACE}_trustedcerts.pem
export SSL_CERT_FILE=~/.certs/${GELLOW_NAMESPACE}_trustedcerts.pem
export REQUESTS_CA_BUNDLE=${SSL_CERT_FILE}

}

getcertificates

export SASUSER=geldmui@gelenable.sas.com
# get user's password
# source script ~/.my_gelenable_functions

function getpw () {

   AZUPW=$(curl -ks https://gelweb.race.sas.com/scripts/gelenable/users/geldmui@gelenable.sas.com.txt)
   echo -e "\n*** Microsoft Azure geldmui@gelenable.sas.com's password = ${AZUPW}"

   echo

}

getpw

echo $AZUPW

echo $RG
INGRESS_FQDN=${RG}.gelenable.sas.com
echo $INGRESS_FQDN

GitHub Variables

Copy the GitHub variables from the txt file you created earlier. Paste them in your terminal:

GITHUB_REPO_URL=fill_in_here
# e.g., GITHUB_REPO_URL=https://github.com/bteleuca/sas-viya-az-devops.git
GITHUB_REPO=sas-viya-az-devops
#  e.g., GITHUB_REPO=sas-viya-az-devops
GITHUB_USER=fill_in_your_user_here
# GITHUB_USER Your GitHub user, typically after https://github.com/ e.g., GITHUB_USER=bteleuca
GITHUB_EMAIL=fill_in_your_email_here
# GITHUB_EMAIL=Your GitHub email
GITHUB_BRANCH=main

Copy the GitHub PAT from the file where you saved it:

GITHUB_PAT=fill_in_here
# e.g., GITHUB_PAT=ghp_z*jc2y*Z*8bG91Y*P9bSJy8MZ8j*Ne3h*uHu

Create the Publishing Destination Using the SAS Viya CLI

As of 2022.09, the publishing destination creation has been simplified. You can now use the SAS Viya CLI to create the publishing destination.

Attention points:

  • You must use the variable $GITHUB_PAT unencoded.
  • DO NOT encode it in BASE64!
  • The descriptions must be double quoted.
  • Do not use line separators (\) in the SAS Viya CLI statements, as the command may fail.

# login with the SAS Viya CLI profile already created
cd ~
export SAS_CLI_PROFILE=${GELLOW_NAMESPACE:-Default}
export SSL_CERT_FILE=~/.certs/${GELLOW_NAMESPACE}_trustedcerts.pem
sas-viya --profile ${SAS_CLI_PROFILE} auth login --user sasboot --password lnxsas

# Use the CLI
sas-viya --profile ${SAS_CLI_PROFILE} models destination createGit --help

# override variables from the previous destination
DEST_NAME=GitHub
DEST_DESC=GitHubCLI
DOMAIN_NAME=GitHubDomainCLIGamma
DOMAIN_DESC=GitHubDomainCLIGamma
CODE_GEN=MAS

# list variables which will be used
printf "\n We need the following variables for the CLI \n "
echo "SAS Viya URL: https://$INGRESS_FQDN"
echo "SAS User Name: ${SASUSER}"
echo "SAS User Password: ${AZUPW}"
echo "SAS Domain Name: ${DOMAIN_NAME}"
echo "SAS Domain Description: $DOMAIN_DESC"
echo "Git User Name:  $GITHUB_USER"
echo "SAS Publishing Destination Name: $DEST_NAME"
echo "SAS Publishing Destination Description: $DEST_DESC"
echo "Git User Email: $GITHUB_EMAIL"
echo "Git Repository URL: $GITHUB_REPO_URL"
echo "Git Repository Branch:  $GITHUB_BRANCH"
echo "Code Generation Mode MAS or CAS: $CODE_GEN"
#echo "Git Personal Access Token: $GITHUB_PAT"
echo "Git Personal Access Token: well... it's secret..."

Check the variables. Create the publishing destination when every variable seems right:

# create the pub dest - one line and descriptions quoted
sas-viya --profile ${SAS_CLI_PROFILE} models destination createGit --name ${DEST_NAME} --description ${DEST_DESC} --credDomainID ${DOMAIN_NAME} --credDescription ${DOMAIN_DESC} --localRepoLocation " /mmprojectpublic" --identityType user --identityId ${SASUSER} --remoteRepoURL ${GITHUB_REPO_URL}  --gitUserEmail ${GITHUB_EMAIL} --gitUserId ${GITHUB_USER} --gitBranch ${GITHUB_BRANCH} --gitAccessToken ${GITHUB_PAT} --codeGenerationMode ${CODE_GEN}

You should see something similar to:

{
    "createdBy": "sasboot",
    "creationTimeStamp": "2023-02-08T00:50:57.761901528Z",
    "description": "GitHubCLI",
    "destinationType": "git",
    "domainId": "GitHubDomainCLIGamma",
    "gitRepository": "https://github.com/bteleuca/sas-viya-az-devops.git",
    "id": "2ffce183-16a7-4593-949d-ae2ef94da15f",
    "links": [
        {
            "href": "/modelPublish/destinations",
            "method": "GET",
            "rel": "up",
            "type": "application/vnd.sas.collection",
            "uri": "/modelPublish/destinations"
        },
...

    "modifiedBy": "sasboot",
    "modifiedTimeStamp": "2023-02-08T00:50:57.761901528Z",
    "name": "GitHub",
    "properties": [
        {
            "name": "credDomainId",
            "value": "GitHubDomainCLIGamma"
        },
        {
            "name": "remoteRepositoryURL",
            "value": "https://github.com/user/sas-viya-az-devops.git"
        },
        {
            "name": "localRepositoryLocation",
            "value": " /mmprojectpublic"
        },
        {
            "name": "userEmail",
            "value": "b.t@none.com"
        },
        {
            "name": "codeGenerationMode",
            "value": "MAS"
        },
        {
            "name": "gitBranch",
            "value": "main"
        }
    ],

If you get another message asking you for variables, check the variable concerned using echo $VAR, for example echo ${DOMAIN_NAME}. Make sure it is filled in as it should be and retry the CLI command.
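A quick way to spot an empty variable is a small bash loop (a convenience sketch, assuming a bash shell; adjust the list of names if you changed any of them):

# Optional sanity check: warn about any required variable that is empty
for v in INGRESS_FQDN SASUSER AZUPW DEST_NAME DEST_DESC DOMAIN_NAME DOMAIN_DESC CODE_GEN GITHUB_USER GITHUB_EMAIL GITHUB_REPO_URL GITHUB_BRANCH GITHUB_PAT; do
   [ -z "${!v}" ] && echo "WARNING: $v is empty"
done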

Clear the Variables

Clear the environment variables used:

unset GITHUB_USER && unset GITHUB_PAT && unset GITHUB_REPO_URL && unset GITHUB_EMAIL && unset GITHUB_BRANCH && unset CODE_GEN

Additional Info

If you ever need to delete a publishing destination, uncomment the code below:
#sas-viya --profile ${SAS_CLI_PROFILE} models destination delete --name ${DEST_NAME}


Resources

For more info see the following post: Creating Git publishing destinations using the SAS Viya CLI.

Log in to SAS Viya

Connect to SAS Viya via the URL with your user name and password.

Check the Publishing Destination

Switch to SAS Environment Manager:

  • Add /SASEnvironmentManager/ after the SAS Viya URL or,
  • Choose Manage Environment from the Applications menu.

In the Publishing Destination section:

  • Refresh.
  • You will see the Git publishing destination created earlier with the CLI:

Conclusion

Use the SAS Viya CLI to create a Git publishing destination. As of SAS Viya version 2022.09, this is the recommended way.

You can create any Git publishing destination, using any Git repository. You just have to change the parameters and the variables.
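To double-check the result at any time, you can list the existing publishing destinations with the models plug-in (the same command is used again later in this workshop):

# Optional: list the publishing destinations and confirm the Git destination is present
sas-viya --profile ${SAS_CLI_PROFILE} --output text models destination list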



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 05, Section 0 Exercise: 04 071 Publish Git 3

Publish to the Git Publishing Destination

Objectives: publish decisions and rule sets to the Git publishing destination.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do them now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_071_Publish_Git_1
  • 04_071_Publish_Git_2

As an alternative, to save time, you can import the solutions mentioned in 04_071_Publish_Git_1:


Log in to SAS Viya

Connect to SAS Viya via the URL with your user name and password.

SAS Intelligent Decisioning

Switch to SAS Intelligent Decisioning:

  • Add /SASDecisionManager/ after the SAS Viya URL or,
  • Choose Build Decisions from the Applications menu.


Publish the Decision to Git

Watch the following video:

Go to Decisions:

1. Select the autoAuctionDec decision you created in the previous exercise.

2. From any tab within the Decision, push the Publish button and follow the prompts.

3. Select the Git (GitHub) publishing destination. Go with the default name autoAuctionDec1_0. Check Replace item with the same name.

4. Publish.


After Publishing

When you publish to Git, a folder and several files are created in the Git repository. The files and their types depend on the rule set or decision objects published.

In your GitHub repository:

  • A Git folder is created.

  • The rule set or the decision logic is described in a DS2 package file.
  • A JSON file details the inputs and the outputs.
  • A second JSON file contains an asset summary.
  • The README.txt in the folder contains versioning information.
  • If a model is used in a decision, many more files, representing the model scoring code and the model metadata, will also be published.

Feel free to browse the repository and look at each file.

Conclusions

You can publish SAS Intelligent Decisioning rule sets, or decisions, as files in a Git repository.

Every time you publish from SAS, Git creates a new commit. Therefore, everything is tracked and versioned.
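If you later clone the repository locally (as done in the 05_071_Deploy_Git_MAS hands-on), you can see each publish as a separate commit. A small sketch, assuming the repository was cloned to ~/sas-viya-az-devops:

# Each publish from SAS appears as a commit touching the published folder
git -C ~/sas-viya-az-devops log --oneline -- autoAuctionDec1_0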



Lesson 06

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 06, Section 0 Exercise: 05 031 Execute CAS

Execute the Decision with PROC CAS

Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do them now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_031_Publish_CAS

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
    • 04_031_Publish_CAS_autoAuctionDec1_0
    • 04_031_Publish_CAS_autoAuctionDec1_0_validation
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.
    • Create the CAS publishing destination.
    • Publish the decision to the CAS publishing destination.

For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Prerequisites

  1. SAS Intelligent Decisioning: You must activate the two lookup tables used by the decision before you can run it in CAS.
  2. SAS Environment Manager: Make sure the sas_model_table is available in Public. If you published the previous day, this table will have been unloaded. You need to republish the decision to the publishing destination - complete the 04_031_Publish_CAS hands-on.

Log in to SAS Viya

Connect to SAS Viya via the URL with your user name and password.

SAS Studio

Switch to SAS Studio:

  • Add /SASStudio/ after the SAS Viya URL or,
  • Choose Develop Code and Flows from the Applications menu.

Execute the Decision with PROC CAS

Paste the code into a SAS Studio program:

1. Paste the following code before the copied SAS Intelligent Decisioning code. (This code creates a CAS session and assigns all of the caslibs to SAS library names.)

CAS mySession;
CASLib _all_ assign;

2. Load AUTOAUCTIONINPUT into an in-memory table from the SASHDAT with the same name.

proc casutil incaslib='casuser' outcaslib='casuser';
    droptable casdata='AutoAuctionInput' quiet;
    load casdata='AutoAuctionInput.sashdat' casout='AutoAuctionInput' copies=0 promote;
run;quit;

3. Copy the sessionProp.setFmtSearch and ds2.runModel statements from the CAS publishing validation test code. The code would look like:

Proc cas;
sessionProp.setFmtSearch / fmtLibNames="userformats3";
ds2.runModel /
   modelName="autoAuctionDec1_0"
   table={caslib="CASUSER", name="AUTOAUCTIONINPUT"}
   modelTable={caslib="Public", name="sas_model_table"}
   casOut={caslib="CASUSER",name="testOutput"};
run;
  quit ;

We changed the output table name to “testOutput.”

4. Execute the code. Look at the log.

5. Note that the Decision output table is created and correct. Open it if you wish.

6. Once you have checked, run the following code to terminate your CAS session:

  cas mysession terminate;

For click-by-click instructions, watch the demonstration: Demo: Execute CAS 01


Fix for Missing CAS Table

If you get the following message:

ERROR: The file or path 'sas_model_table' is not available in the file system.
NOTE: Added action set 'ds2'.
ERROR: Table 'sas_model_table' could not be loaded.
ERROR: Failure opening table 'sas_model_table': A table could not be loaded.
ERROR: The action stopped due to errors.

Issue

It means that you published to CAS the previous day. The environment stop / start unloaded the sas_model_table.

Solution

The fix is quick. You need to complete the 04_031_Publish_CAS hands-on.

Then come back to SAS Studio and re-execute the code.


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 06, Section 0 Exercise: 05 041 Execute SAS Micro Analytic Service

Execute a Published Decision in SAS Micro Analytic Service


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do them now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_041_Publish_SAS_Micro_Analytic_Service

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.
    • Publish the decision to the SAS Micro Analytic Service publishing destination.

For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Methods

To use the SAS Viya REST API, you, or your application, will require a valid SAS Viya Access Token.

To obtain an access token, there are several methods, called grants (grant types):

  • Client id and client secret.
  • Authorization code.
  • Password.

The choice has been made for the client ID and client secret method, for the following reasons:

  • Azure provides a similar process, which will be discussed in a later hands-on.
  • Appropriate for REST APIs, typically triggered using command line.
  • Does not expose a user’s id and password.
  • Access can be revoked by a SAS administrator, without affecting other users or processes.

Client ID and Client Secret Process

Command line:

  • Get a SAS Viya Bearer token using an admin user and the password.
  • Use the Bearer token to register the new client ID and client secret.
  • Get a SAS Viya access token using the client and the secret.
  • Use the SAS Viya access token to call the decision published to SAS Micro Analytic Service.

Client ID and Client Secret Registrations

The following tasks must be performed by a SAS administrator. For click-by-click instructions, watch the demonstration: Demo: Execute MAS 01

All subsequent operations are done on sasnode01.

1. Connect to sasnode01.

2. On your terminal, run:

cd ~
# Get your USER Prefix
getprefix

# AZ LOGIN
azlogin

# admin pass
echo $AZUPW

3. Initialize Shell Variables. Then set the SAS Viya namespace, the URL, the client name, the secret and the grant type.

# Important variables
RG=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep resource-group | awk -F'::' '{print $2}')
export current_namespace=$GELENV_NS
export INGRESS_FQDN=${RG}.gelenable.sas.com
export INGRESS_URL="https://${INGRESS_FQDN}"
export CLIENT_ID=MASClient
export CLIENT_SECRET=SASGl0bal$
export GRANT_TYPES=client_credentials
echo $INGRESS_FQDN && echo $INGRESS_URL

4. Get Viya Certificates and initialize $CURL_CA_BUNDLE.

We will need this variable when we call the scoring REST API.

# Get Viya Certificates and initialize $CURL_CA_BUNDLE.
# Otherwise you will get an error: curl: (60) Peer's Certificate issuer is not recognized.
# More details here: http://curl.haxx.se/docs/sslcerts.html

cd ~
ls -la ~/.certs
# kubectl cp $(kubectl get pod -l app=sas-logon-app  -o=jsonpath='{.items[0].metadata.name}'):security/trustedcerts.pem ~/.certs/${current_namespace}_trustedcerts.pem
export CURL_CA_BUNDLE=~/.certs/${current_namespace}_trustedcerts.pem && echo $CURL_CA_BUNDLE

5. Get a Bearer token.

As of 2023.1 and later, the consul token is no longer needed.

# Request a valid Bearer token to use on the registration call
export BEARER_TOKEN=`curl -sk -X POST \
"${INGRESS_URL}/SASLogon/oauth/token" \
-u "sas.cli:" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "grant_type=password&username=geldmui@gelenable.sas.com&password=${AZUPW}" | awk -F: '{print $2}'|awk -F\" '{print $2}'`
echo $BEARER_TOKEN

6. Use the Bearer token to register the new client ID and client secret. The client and secret scope is the SAS Users group (group ID sasusers). The grant type is client_credentials. The access token is valid for 3600 seconds, or 1 hour.

echo $INGRESS_URL
echo $CLIENT_ID
echo $CLIENT_SECRET
echo $GRANT_TYPES

# delete the client if already present
curl -sk -X DELETE "${INGRESS_URL}/SASLogon/oauth/clients/${CLIENT_ID}" \
-H "Authorization: Bearer ${BEARER_TOKEN}"

# Register a new client_id and secret
curl -sk -X POST "${INGRESS_URL}/SASLogon/oauth/clients" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer ${BEARER_TOKEN}" \
-d "{\"client_id\": \"${CLIENT_ID}\",
\"client_secret\": \"${CLIENT_SECRET}\",
\"scope\": [\"openid\", \"sasusers\"],
\"authorized_grant_types\": [\"${GRANT_TYPES}\"],
\"redirect_uri\": \"urn:ietf:wg:oauth:2.0:oob\",
\"access_token_validity\": 3600}" | jq

At this point, the client ID and secret are registered and can be used from any third-party application to call the SAS Viya REST API. The SAS administrator has to provide the client ID and secret to the user or application that needs to access the SAS Viya REST API.

7. Get the clients and confirm your client has been created.

# List the clients created
curl -sk -X GET "${INGRESS_URL}/SASLogon/oauth/clients" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer ${BEARER_TOKEN}" | jq | grep "client_id"

At the top of the list, you will see your client registered.

     "client_id": "MASClient",
    ...

Execute a Decision

With CURL

As a user, you will receive the registered client and secret from the Administrator. With these, you can get an access token and score your decision.

1. Variables were defined above:

cd ~
echo $PREFIX
echo $INGRESS_URL
echo $CURL_CA_BUNDLE
export CLIENT_ID=MASClient
export CLIENT_SECRET=SASGl0bal$
export GRANT_TYPES=client_credentials

2. Get a SAS Viya access token using the client id and secret:

# Get SAS Viya token
ACCESS_TOKEN="$(curl -X POST "${INGRESS_URL}/SASLogon/oauth/token" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "grant_type=${GRANT_TYPES}&client_id=${CLIENT_ID}&client_secret=${CLIENT_SECRET}" | jq -r '.access_token')"
echo $ACCESS_TOKEN

3. Score the SAS decision published to SAS Micro Analytic Service:

# Execute Decision
DECISION_URL="${INGRESS_URL}:443/microanalyticScore/modules/autoauctiondec1_0/steps/execute"
echo $DECISION_URL

curl -X POST "${DECISION_URL}" \
-H "Authorization: Bearer ${ACCESS_TOKEN}" \
-H "Content-Type: application/json;charset=utf-8" \
-H "Accept: application/json" \
--data '{
"version":1,
"inputs":[
{"name":"BLUEBOOKPRICE_","value":80000},
{"name":"CURRENTBID_","value":90000},
{"name":"MAKE_","value":"Tesla"},
{"name":"MILES_","value":5000},
{"name":"MODEL_","value":"X100D"},
{"name":"ORIGINALINVOICE_","value":100000},
{"name":"ORIGINALMSRP_","value":100000},
{"name":"STATE_","value":"CA"},
{"name":"VIN_","value":"12345678901234562"},
{"name":"YEAR_","value":2017}
]}' | jq

Additional Resources

If you want to know more about authentication:



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 06, Section 0 Exercise: 05 043 Execute MAS Postman

OPTIONAL: Execute a Published Decision in SAS Micro Analytic Service with Postman

To continue this exercise, you must create a Postman account or use your own.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do them now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_041_Publish_SAS_Micro_Analytic_Service

As an alternative, to save time, you can import the solutions mentioned in 05_041_Execute_SAS_Micro_Analytic_Service.


Prerequisites

The previous hands-on 05_041_Execute_SAS_Micro_Analytic_Service must be completed.

Download Postman and Postman Files

For this exercise, you will be using Postman to execute the decision REST API.

For click-by-click instructions, watch the demonstration: Demo: Execute MAS Postman 01

Steps:

1. Download Postman on the workshop Windows client. Install it. This hands-on uses Postman v10.1.2.

2. Open Postman. You will be asked to sign in or create an account.

3. We prepared a collection and some requests, which will call your decision deployed to SAS Micro Analytic Service. Download the Postman files. On sasnode01 you will find those in ~/PSGEL267-using-sas-intelligent-decisioning-on-sas-viya/postman/

4. You must sign in or create an account if you want to continue.

After sign-in, import the files in Postman.

Postman Calls

5. From Environments, open SAS_Viya_Azure. Adapt the SAS Viya URL in the environment variables: replace ${RG} with your resource group (hint: run echo $RG on sasnode01). Save.

6. Go to Collections > Obtain_Access_token. Select the SAS_Viya_Azure environment at the top right of the screen.

7. Send to get a SAS Viya access token, using the client and the secret registered by the SAS Administrator.

8. Go to Collections > Execute_Decision_MAS. Send to execute the decision published to SAS Micro Analytic Service. This uses the SAS Viya access token obtained previously.

9. In the request body, change the state to CA and Send to execute once more. Watch the new outputs.

Get CA Certificate

To use SSL or TLS, you need the CA certificate as a PEM file.

You can download it by navigating to sasnode01 > SFTP /home/cloud-user/.certs/ folder.

Use SSL

In the previous example, SSL certificate verification was disabled.

Secure Sockets Layer (SSL) is a standard security technology for establishing an encrypted link between a server and a client. In this example, the server is SAS Viya on Azure and the client is Postman.

The data you are sending in the decision request can be quite sensitive and could be intercepted in transit. Obviously, you need to protect that data.

In this example you will:

1. Enable SSL certificate verification in Postman: File > Settings > General.

2. Import the certificates in Postman. From Certificates > activate CA Certificates > PEM file > select file.

3. Resend the decision request.

4. Resend the decision request, with state=PA and Make=Scion.

For click-by-click instructions, watch the demonstration: Demo: Execute MAS Postman 02

Optional: Revoke the Client

The client ID and secret were created by the SAS administrator, with an access token time limit of one hour.

If the client ID and secret are compromised, the SAS administrator can delete the client and cut off access to SAS Viya resources.

To revoke a client:

1. Connect to sasnode01 using MobaXterm or a Terminal application (ssh sasnode01).

# Delete the clients
curl -sk -X DELETE "${INGRESS_URL}/SASLogon/oauth/clients/${CLIENT_ID}" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer ${BEARER_TOKEN}"

2. Get the clients and confirm that your client has been deleted.

# List the clients created
curl -sk -X GET "${INGRESS_URL}/SASLogon/oauth/clients" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer ${BEARER_TOKEN}" | jq | grep "client_id"

3. The client has now been deleted. Any further attempt to execute a request to obtain an access token using the client or to execute the decision will fail.

# clear the environment
unset current_namespace
unset INGRESS_URL
unset CLIENT_ID
unset CLIENT_SECRET
unset GRANT_TYPES
unset CONSUL_TOKEN
unset OAUTH_TOKEN

Conclusion

You learned how to call a decision published to SAS Micro Analytic Service from outside SAS Viya, using Postman.


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 06, Section 0 Exercise: 05 051 Execute Azure

Execute the Decision in Azure


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do them now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_051_Publish_Azure_1
  • 04_051_Publish_Azure_2

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.
  • There is no importable alternative for 04_051_Publish_Azure_1 nor for 04_051_Publish_Azure_2. You must complete these hands-on exercises.

For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Execute (Score) the Decision in Azure

To score a decision published as a container image, you need a running container, web app or Kubernetes pod.

In this example, you will deploy the decision as an Azure Container Instance (ACI). This is the simplest way to deploy a container image.

Azure Container Instance Variables

$MYUSER is your 6-letter SAS user ID.

Component                     Value
Resource group (ACR)          $PREFIX-rg
Container name                autoauctiondec
Region                        US (East US)
Image source                  Azure Container Registry
Registry                      ${PREFIXNODASH}acr
Image                         autoauctiondec1_0
Image tag                     latest
OS type                       Linux
Size                          choose default
Networking type               Public
DNS name label                ${PREFIXNODASH}-autoauctiondec
DNS name label scope reuse    No reuse
Port                          8080
Port protocol                 TCP
Tags name                     $TAGS

Create an Azure Container Instance

For click-by-click instructions, watch the demonstration: Demo: Execute Az 01

Script

# Get your USER Prefix
getprefix

# AZ LOGIN
azlogin

# Get workshop variables
PREFIX=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep prefix | awk -F'::' '{print $2}')
PREFIXNODASH=$(echo $PREFIX | sed 's/-//g')
RG=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep resource-group | awk -F'::' '{print $2}')
LOCATION=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep location | awk -F'::' '{print $2}')

# Get TAGS
TAGS=$(cat /home/cloud-user/MY_TAGS.txt | sed 's/[,"]//g;s/ = /=/g')

CONT=autoauctiondec
IMAGE=autoauctiondec1_0
IMAGE_TAG=latest
ACR_PASS=$(az acr credential show -n ${PREFIXNODASH}acr --query "passwords[0].value"  -o tsv)

printf "\n We need the following variables \n "
echo "Container name: $CONT"
echo "Resource group: $RG"
echo "Azure Container Registry: ${PREFIXNODASH}acr"
echo "SAS Container Runtime container image: $IMAGE and tag: $TAG"
echo "Azure Container Registry credential: ${ACR_PASS}"
echo "DNS name: $PREFIXNODASH-$CONT"
echo "Tags: $TAGS"

# Create the ACI

az container create -n $CONT -g $RG \
            --image "${PREFIXNODASH}acr.azurecr.io/$IMAGE:$IMAGE_TAG" \
            --registry-username ${PREFIXNODASH}acr \
            --registry-password "$ACR_PASS" \
            --ports 80 8080 \
            --protocol TCP \
            --dns-name-label $PREFIXNODASH-$CONT \
            --location $LOCATION

INFO ONLY: Using the Interface

Obviously, you can also use the Azure Portal interface to create a container. More info in How to Score a SAS Decision Published to Azure with SAS Container Runtime.

Container Instance

Your container instance should be up and running in a minute or so.

1. Go to your Azure Portal - Container Instances.

2. Identify your container instance. Copy the FQDN generated from the interface. It should be:

FQDN=$PREFIXNODASH-$CONT.$LOCATION.azurecontainer.io
echo $FQDN

3. The container created has the following characteristics.

4. Click Properties to see the ports on which it is listening.

Score the Running Container using curl

For click-by-click instructions, watch the demonstration: Demo: Execute Az 02

From your Azure portal:

  • Go into your resource group ${RG}
  • Open the container instance autoauctiondec
  • From top right, open a Cloud Shell instance:

  • The first time you will be asked for a storage account.
  • Choose PSGEL267 Using SAS Intelligent Decisioning with SAS Viya subscription.
  • Show advanced settings.
  • Resource group: use existing. Choose your specific $PREFIX-azuredm-data e.g. sbxbot-r0089-azuredm-data.
  • Storage account: use existing.
  • Choose ${PREFIXNODASH}sa storage account e.g. sbxbotr0089sa
  • In File share: use existing and type fsdata
  • Attach to storage.

  • Copy the FQDN from the container instance.

  • In your Cloud Shell terminal, type FQDN= and paste the FQDN value that you copied earlier.

  • Score the decision deployed to the ACI through the SAS Container Runtime REST API. A request is sent to the FQDN, on port 8080, followed by the decision endpoint.
echo $FQDN
curl --location --request POST "http://${FQDN}:8080/autoauctiondec1_0"  --header 'Content-Type: application/json'  --header 'Accept: application/json'  --data '{
"version":1,
"inputs":[
{"name":"BlueBookPrice","value":80000},
{"name":"CurrentBid","value":90000},
{"name":"Make","value":"Tesla"},
{"name":"Miles","value":5000},
{"name":"Model","value":"X100D"},
{"name":"OriginalInvoice","value":100000},
{"name":"OriginalMSRP","value":100000},
{"name":"state","value":"CA"},
{"name":"VIN","value":"12345678901234562"},
{"name":"Year","value":2017}
]
}' | jq

You should see:

Scroll up:

Re-Score

You can try a second example, which will generate a negative Bid decision:

curl --location --request POST "http://${FQDN}:8080/autoauctiondec1_0"  --header 'Content-Type: application/json'  --header 'Accept: application/json'  --data '{
"version":1,
"inputs":[
{"name":"BlueBookPrice","value":10000},
{"name":"CurrentBid","value":9000},
{"name":"Make","value":"Scion"},
{"name":"Miles","value":20000},
{"name":"Model","value":"TC"},
{"name":"OriginalInvoice","value":18000},
{"name":"OriginalMSRP","value":19500},
{"name":"state","value":"PA"},
{"name":"VIN","value":"12345678901234566"},
{"name":"Year","value":2016}
]
}' | jq

You should receive the scoring result:

Optional: Score the Running Container using Python

On your Cloud Shell terminal, execute the following:

1. Create a Python program to score the decision:

# FQDN check
echo $FQDN

# Python Scoring File
tee  score.py > /dev/null <<EOF

import http.client
import json
aci_dns= '${FQDN}:8080'
print ("Your ACI DNS is: ", aci_dns)
print()

conn = http.client.HTTPConnection(aci_dns)
payload = json.dumps({
    "inputs":[
    {"name":"BlueBookPrice","value":80000},
    {"name":"CurrentBid","value":90000},
    {"name":"Make","value":"Tesla"},
    {"name":"Miles","value":5000},
    {"name":"Model","value":"X100D"},
    {"name":"OriginalInvoice","value":100000},
    {"name":"OriginalMSRP","value":100000},
    {"name":"state","value":"CA"},
    {"name":"VIN","value":"12345678901234562"},
    {"name":"Year","value":2017}
    ]
    })
headers = {
  'Accept': 'application/json',
  'Content-Type': 'application/json'
}
conn.request("POST", "/autoauctiondec1_0", payload, headers)
res = conn.getresponse()
data = res.read()

print ("Your Scoring Result is: ", data.decode("utf-8"))
print()

EOF

cat score.py

3. Call the scoring program:

# Score Decision using Python
python3 score.py

You should see:

4. Exit the Cloud Shell (X at top right).


Conclusion

You have just deployed a decision in Azure and scored it using bash scripts or from a Python program.


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 06, Section 0 Exercise: 05 056 Python App on Azure

Score a Decision in Azure with a Python Flask Web App

You will learn how to create a front-end web app using Python. The web app will call the back-end container instance where the decision is running.

Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do them now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_051_Publish_Azure_1
  • 04_051_Publish_Azure_2
  • 05_051_Execute_Azure

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.
  • There is no importable alternative for 04_051_Publish_Azure_1, 04_051_Publish_Azure_2, or 05_051_Execute_Azure. You must complete these hands-on exercises.

For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Deploy an Azure Web App

For click-by-click instructions, watch the demonstration: Demo: Execute Az 03


Get variables

On sasnode01:

# Get your USER Prefix
getprefix

# AZ LOGIN
azlogin

# Get workshop variables
RG=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep resource-group | awk -F'::' '{print $2}')
LOCATION=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep location | awk -F'::' '{print $2}')
PREFIX=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep prefix | awk -F'::' '{print $2}')
PREFIXNODASH=$(echo $PREFIX | sed 's/-//g')

# Get TAGS
TAGS=$(cat /home/cloud-user/MY_TAGS.txt | sed 's/[,"]//g;s/ = /=/g')

# Get Storage Account
SANAME=$(echo $PREFIX | sed 's/-//g')sa

Create an Azure Container Registry and Activate Admin

Done in 04_051_Publish_Azure_1.

List SAS Container Runtime Container Images in the Azure Container Registry

# list all images in our Azure Container Registry
az acr repository list -n ${PREFIXNODASH}acr -o table

# show the tags for the created repository
az acr repository show-tags -n ${PREFIXNODASH}acr  --repository autoauctiondec1_0  -o table

Create an Azure Container Instance

Done in 05_051_Execute_Azure.
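Before continuing, you can optionally confirm that the container instance from the previous exercise is still running (a sketch; it assumes the $RG variable set above and the container name used in 05_051_Execute_Azure):

# Optional: check the state of the container instance
az container show -g $RG -n autoauctiondec --query "instanceView.state" -o tsv
# Expected output: Running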

Get the Container Instance FQDN and IP

CONT=autoauctiondec
export FQDN=$(az container show -g $RG -n $CONT | jq -r .ipAddress.fqdn)
echo $FQDN
export IP=$(az container show -g $RG -n $CONT | jq -r .ipAddress.ip)
echo $IP

# The SAS Container Runtime container endpoint (the known, published module name)
export ENDPOINT=autoauctiondec1_0
echo $ENDPOINT

# These values will be inserted into the front-end web app app.py script below

Web App Files

The Python Flask Web App files are stored in the workshop repository.

The main file is app.py. In this file, a form is defined. When the form is filled in and the button is pressed, the parameters are sent via a REST API to the Azure Container Instance where the decision is deployed.

There are a few other files and folders for the formatting and rendering, e.g. /static or /templates/.

We now want to get the IP of the running container instance and insert it in the app.py.

In real life, you should use the Fully Qualified Domain Name (FQDN), not the IP. As it takes a while to have the FQDN added to an Azure DNS, the IP is faster for our workshop.

cd $ROOTDIR
git pull
cd scr-python-flask-webapp
ls

# Initialize the app.py with the FQDN

sed -i 's/{{FQDN}}/'"${FQDN}"'/' ./app.py
sed -i 's/{{ENDPOINT}}/'"${ENDPOINT}"'/' ./app.py
sed -i 's/{{IP}}/'"${IP}"'/' ./app.py

cat app.py

FQDN should be replaced with the FQDN of your ACI. The endpoint should be autoauctiondec1_0.
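Optionally, you can make sure that no placeholders are left behind (a quick check; it assumes you are still in the scr-python-flask-webapp folder):

# Optional: confirm that no {{...}} placeholders remain in app.py
grep -n '{{' ./app.py && echo "WARNING: some placeholders were not replaced" || echo "All placeholders replaced"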

Deploy the Flask Web App

Deployment is made incredibly easy by Microsoft. You can use only one command: az webapp up

echo $RG && echo $LOCATION && echo ${PREFIXNODASH}-scrwebapp
rm -rf *.png
rm -rf __pycache__
rm -rf *.sh
rm -rf score.py
rm -rf *.md
ls -la

# Deploy your Flask APP
az webapp up --runtime PYTHON:3.9 --sku P1V2 --logs --name ${PREFIXNODASH}-scrwebapp --resource-group $RG --location $LOCATION --verbose

Web App Deployment Starts

You should see:

The webapp 'scrwebapp' doesn't exist
Found sku argument, skipping use default sku
Creating AppServicePlan '5ea9538e-8087-4eb7-b6c0-c923ba21fcbb_asp_9824' ...
Creating webapp 'scrwebapp' ...
will set appsetting for enabling build
Configuring default logging for the app, if not already enabled
Creating zip with contents of dir /home/cloud-user/PSGEL267-using-sas-intelligent-decisioning-on-sas-viya/scr-python-flask-webapp ...
Getting scm site credentials for zip deployment
Starting zip deployment. This operation can take a while to complete ...
Deployment endpoint responded with status code 202
Fetching changes.
Fetching changes.
Updating submodules.
Running oryx build...
...
Running oryx build...
Parsing the build logs
Fetching changes.
Fetching changes.

Deployment Complete

After a few minutes, you should see that your app is available for scoring via a link:

You can launch the app at http://scrwebapp.azurewebsites.net
Configuring default logging for the app, if not already enabled
2022-11-11T07:21:51  Welcome, you are now connected to log-streaming service.

Open the Web App

Navigate to the link provided: https://${PREFIXNODASH}-scrwebapp.azurewebsites.net

It might take a while, up to five minutes, to load the app the first time, while all the resources are loaded and the DNS entry is created.
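While you wait, you can optionally check the web app state back on sasnode01 (a sketch; it assumes the variables used for the az webapp up command are still set in that shell):

# Optional: confirm the web app is running and show its host name
az webapp show -g $RG -n ${PREFIXNODASH}-scrwebapp --query "{state:state, host:defaultHostName}" -o table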

But when it’s loaded, that is what you will see:

Voila!

Type in the following:

  • Make: Tesla
  • Model: 3
  • CurrentBid: 90000
  • Miles: 5000
  • State: CA
  • Year: 2017
  • BlueBookPrice: 80000
  • OriginalInvoice: 100000
  • OriginalMSRP: 100000
  • VIN: 12345678901234562

Press Score with SCR:

Change a few parameters and score again.

Remove the web app and the container when you are done:

You might need to press CTRL + C in the console.

az webapp delete --name ${PREFIXNODASH}-scrwebapp --resource-group $RG

# Delete the Container Instance

az container delete --name $CONT --resource-group $RG
# 'y' when prompted


Conclusion

You have now deployed a front-end web app for your back-end container instance - all running in Azure. Inside the container, you have a SAS decision running: scoring a model, calling DS2 files, and using rule sets and lookup tables.

Good job!

End


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 06, Section 0 Exercise: 05 071 Deploy Git MAS

Deploy a Published Decision from Git to SAS Micro Analytic Service

SKIP the Hands-On!

The SAS Viya CLI cannot install the decisiongitdeploy plug-in on the training Linux machines. This plug-in is required for the exercises.

Skip this hands-on. Feel free to read, but you won’t be able to execute it.

Objective

You will learn how to deploy a decision published to Git to SAS Micro Analytic Service.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercises, do them now. This exercise builds on those.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_071_Publish_Git_1
  • 04_071_Publish_Git_2
  • 04_071_Publish_Git_3

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder on the import click-path, see 99_010_ImportSolution for more details.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.
  • There is no importable alternative for 04_071_Publish_Git_1, 04_071_Publish_Git_2, or 04_071_Publish_Git_3. You must complete these hands-on exercises.


For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Deploy From Git

The deployment process:

  • Make the Git files containing the published decisions accessible to the SAS Viya Command Line Interface (CLI).
  • Select the publishing destination.
  • Use the decisiongitdeploy SAS Viya CLI plug-in to deploy these decisions to the SAS Viya publishing destination.

For click-by-click instructions, watch the demonstration: Demo: Deploy Git MAS 01

Make the Git Files Accessible to the SAS Viya CLI

To access the files from your Git repository, an easy option is to clone the Git repository.

All subsequent operations are done on sasnode01.

1. Connect to sasnode01 with your SAS userid and domain password, either using MobaXterm or a Terminal application (ssh sasnode01).

2. Check your user. On your sasnode01 terminal, run:

cd ~
pwd
# env variables
azlogin
getprefix
export KUBECONFIG=~/.kube/config
echo $KUBECONFIG

3. Copy the GitHub variables from your text file, vars.txt, and paste them in your terminal:

GITHUB_REPO_URL=<fill-in-here>
# e.g., GITHUB_REPO_URL=https://github.com/bteleuca/sas-viya-az-devops.git
GITHUB_REPO=sas-viya-az-devops
#  e.g., GITHUB_REPO=sas-viya-az-devops
GITHUB_USER=<fill-in-your-user-here>
# GITHUB_USER Your GitHub user, typically after https://github.com/ e.g., GITHUB_USER=bteleuca
GITHUB_EMAIL=<fill-in-your-email-here>
# GITHUB_EMAIL=Your GitHub email
GITHUB_BRANCH=main

4. Copy the GitHub PAT from the file where you saved it (pat.txt):

GITHUB_PAT=<fill-in-here>
# e.g., GITHUB_PAT=ghp_z*jc2y*Z*8bG91Y*P9bSJy8MZ8j*Ne3h*uHu

5. Clone the repository

echo 'Delete folder if it exists'
rm -rf ~/sas-viya-az-devops
echo Pull private repo using PAT
git clone "https://${GITHUB_USER}:${GITHUB_PAT}@github.com/${GITHUB_USER}/${GITHUB_REPO}.git" -b ${GITHUB_BRANCH}
ls sas-viya-az-devops

You should see:

autoAuctionDec1_0  README.md

Select the Publishing Destination

A SAS Micro Analytic Service publishing destination must exist prior to deploying. In most SAS Viya deployments, a SAS Micro Analytic Service destination is created by default, just like in this deployment.
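Once you have logged in with the SAS Viya CLI (the login step is shown below), you can optionally confirm that the default destination exists. The same models plug-in commands appear at the end of this hands-on; maslocal is the default destination name in this deployment:

# Optional: confirm the default SAS Micro Analytic Service publishing destination
sas-viya -k --profile ${SAS_CLI_PROFILE} --output text models destination show -n "maslocal"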

Use the decisiongitdeploy SAS Viya CLI

To deploy the decisions from the cloned files, you must use the SAS Viya Command Line Interface (CLI).

Deploy to SAS Micro Analytic Service

You must use the decisiongitdeploy plug-in to deploy the decision.

When you call the deploy command, you must be in the folder where the Git repository, containing the decision files, was cloned.

In the example below, the cloned repository folder contains subfolders, one for each published rule set or decision. The subfolder name is passed in the decisiongitdeploy command.

The deployment code looks like:


# login with the sas-viya cli profile already created
cd ~
export SAS_CLI_PROFILE=${GELLOW_NAMESPACE:-Default}
export SSL_CERT_FILE=~/.certs/${GELLOW_NAMESPACE}_trustedcerts.pem
sas-viya -k auth login --user sasboot --password lnxsas

# Deploy from GIT to SAS Micro Analytic Service

# cd to the folder where the repository containing the published objects was cloned
cd ~/sas-viya-az-devops
pwd && ls -la

# Deploy a decision from Git
/opt/sas/viya/home/bin/sas-viya -k --profile ${SAS_CLI_PROFILE} decisiongitdeploy deploy autoAuctionDec1_0 --force=true

# Notes
## --destinationtype=MAS parameter is optional, as MAS is the default deployment destination
## --force=true, allows to overwrite a MAS module if it already exists; results in an error if the module exists and the parameter is not specified

You should see:

Deploying module autoAuctionDec1_0 to MAS.
Checking for module autoAuctionDec1_0 pre-existance.
...
Deleting module autoAuctionDec1_0.
...
New module "autoAuctionDec1_0" was created at "https://sbxbot-gel.eastus.cloudapp.azure.com/microanalyticScore/modules/autoauctiondec1_0".

Lastly

# List destinations and published SAS objects
sas-viya -k --profile ${SAS_CLI_PROFILE} --output text models destination list
echo Show Destination Information
sas-viya -k --profile ${SAS_CLI_PROFILE} --output text models destination show -n "maslocal"
echo Get a List of Published Objects
sas-viya -k --profile ${SAS_CLI_PROFILE} --output text models published list

Conclusions

To deploy rule sets or decisions, from a Git repository, to a SAS Micro Analytic Service publishing destination:

  • First, make the Git files accessible to the SAS Viya Command Line Interface (CLI).
  • Second, select the existing publishing destination.
  • Third, use the SAS Viya CLI decisiongitdeploy plug-in, to deploy to the chosen publishing destination.



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 06, Section 0 Exercise: 05 076 Deploy Git CAS

Deploy a Published Decision from Git to CAS

SKIP the Hands-On!

The SAS Viya CLI cannot install the decisiongitdeploy plug-in on the training Linux machines. This plug-in is required for the exercises.

Skip this hands-on. Feel free to read, but you won’t be able to execute it.


Complete the Hands-On Exercises

If you haven’t completed the previous hands-on exercise 05_071_Deploy_Git_MAS, complete it now.

This exercise builds on other hands-on as well.

  • 02_21_Rule_Set
  • 02_31_Lookup_Table
  • 03_061_Decision
  • 03_062_Model
  • 04_071_Publish_Git_1
  • 04_071_Publish_Git_2
  • 04_071_Publish_Git_3
  • 05_071_Deploy_Git_MAS

As an alternative, to save time, you can import the solutions mentioned in 05_071_Deploy_Git_MAS:

For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


Prerequisites

  • CAS destination must exist: 04_031_Publish_CAS.
  • Git publishing destination must exist.
  • This hands-on exercise continues the work in 05_071_Deploy_Git_MAS.

Deploy to CAS

Instead of SAS Micro Analytic Service, you can also deploy to CAS. Obviously, a CAS publishing destination must exist.

For click-by-click instructions, watch the demonstration: Demo: Deploy Git CAS 01

You must use the decisiongitdeploy plug-in to deploy the decision.

When you call the deploy command, you must be in the folder where the Git repository, containing the decision files, was cloned.

In the example below, the cloned repository folder contains subfolders, one for each published rule set or decision. The subfolder name is passed in the decisiongitdeploy command.

The deployment code looks like:

# Deploy from GIT

# cd to the folder where the repository containing the published objects was cloned
cd ~/sas-viya-az-devops
ls

# Deploy a decision from Git
sas-viya -k --profile ${SAS_CLI_PROFILE} decisiongitdeploy deploy autoAuctionDec1_0 --force=true --destinationtype CAS --server cas-shared-default --libname Public --tablename sas_model_table

# Notes
## --destinationtype=CAS parameter is needed, as SAS Micro Analytic Service is the default deployment destination
## --force=true, allows to overwrite a CAS table, if it already exists
## must specify the target CAS server, caslib and table name. They all must exist before deployment.

You should see:

Deploying module autoAuctionDec1_0 to CAS.
Module autoAuctionDec1_0 successfully deployed to CAS.

In the SAS Environment Manager > Data section, you will see the new modification date of the sas_model_table. This date tells you that it has been refreshed by your deployment:

The deployment forces the overwrite of the decision previously published to CAS.

The decision has now been deployed from Git to CAS.

Clean-Up

Remove the GITHUB_PAT from your environment variables!

unset GITHUB_PAT && unset GITFOLDER && unset GITHUB_REPO_URL && unset GITHUB_REPO && unset GITHUB_USER && unset GITHUB_EMAIL && unset GITHUB_BRANCH

rm -rf ~/sas-viya-az-devops
ls

Conclusions

To deploy rule sets or decisions, from a Git repository, to a Cloud Analytic Service (CAS) publishing destination:

  • First, make the Git files accessible to the SAS Viya Command Line Interface (CLI).
  • Second, select the existing publishing destination.
  • Third, use the SAS Viya CLI decisiongitdeploy plug-in, to deploy to the chosen publishing destination.



Lesson 07

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 031 Data Grid

Create a Rule Set with Data Grids


Build the Test Data

1. Run the following code in SAS Studio to build the input tables that support the development of the data grids and rule set.

data customerBase ;
length customerID 8 firstName $20 lastName $20 age 8;
customerID=1; firstName="John"; lastName="Frizzel"; age=52; output;
customerID=2; firstName="Dianne"; lastName="West"; age=36; output;
customerID=3; firstName="Fred"; lastName="Goodman"; age=28; output;
customerID=4; firstName="Janet"; lastName="Planet"; age=42; output;
run;

data customerAddress;
length customerID 8 address1 $20 address2 $20 city $20 state $2 estimatedPrice 8;
customerID=1; address1="42 Red Storm St"; address2=""; city="Lancaster"; state="PA"; estimatedPrice=300000; output;
customerID=1; address1="24 Coastal Ave"; address2="21"; city="Beach Town"; state="NJ"; estimatedPrice=400000; output;
customerID=2; address1="400 Wood St"; address2=""; city="Erie"; state="PA"; estimatedPrice=200000; output;
customerID=2; address1="25 Lake Rd"; address2="A"; city="Erie"; state="PA"; estimatedPrice=100000; output;
customerID=2; address1="25 Lake Rd"; address2="B"; city="Erie"; state="PA"; estimatedPrice=100000; output;
customerID=2; address1="25 Lake Rd"; address2="C"; city="Erie"; state="PA"; estimatedPrice=100000; output;
customerID=3; address1="10 Jackson St"; address2=""; city="Lebanon"; state="PA"; estimatedPrice=150000; output;
customerID=4; address1="52 Oakton Blvd"; address2=""; city="Reading"; state="PA"; estimatedPrice=300000; output;
run;

data customerPhone;
length customerID 8 phoneNumber $20 type $20 primary 8;
customerID=1; phoneNumber="555-4172"; type="Home"; primary=0; output;
customerID=1; phoneNumber="555-3209"; type="Mobile"; primary=1; output;
customerID=1; phoneNumber="555-4304"; type="Work"; primary=0; output;
customerID=2; phoneNumber="555-3001"; type="Home"; primary=1; output;
customerID=2; phoneNumber="555-4329"; type="Mobile"; primary=0; output;
customerID=2; phoneNumber="555-8904"; type="Work"; primary=0; output;
customerID=3; phoneNumber="555-6789"; type="Work"; primary=1; output;
customerID=4; phoneNumber="555-1290"; type="Mobile"; primary=0; output;
customerID=4; phoneNumber="555-2109"; type="Work"; primary=1; output;
run;

2. Examine the log and ensure that the output SAS tables were created. You can easily see the structure in the above code. Examine the tables. All tables are created in the WORK library:

  • A base customer table.

  • A customer address table with multiple addresses for each customer.

  • A customer phone number table with multiple phone numbers for each customer.

For click-by-click instructions, watch the demonstration: Demo: DataGrids 02


Serialize the Data

1. Run the following code in SAS Studio to serialize the customer address and customer phone tables.

Note that the dcm_serializeGrid auto-call macro is delivered with SAS Intelligent Decisioning.

%dcm_serializeGrid(
      gridSourceTable=customerAddress,
      gridColName=address,
      outputTable=addressGrid,
      classvars=customerID);

%dcm_serializeGrid(
      gridSourceTable=customerPhone,
      gridColName=phone,
      outputTable=phoneGrid,
      classvars=customerID);

2. Examine the log and ensure that the output SAS tables, AddressGrid and PhoneGrid were created. Examine the tables. They are the serialized versions of the CustomerAddress and CustomerPhone tables, respectively. Both tables are created in the WORK library.

For click-by-click instructions, watch the demonstration: Demo: DataGrids 02


Join the Serialized Tables

1. Run the following code in SAS Studio to join the base Customer table with the serialized customer address AddressGrid and customer phone PhoneGrid tables.

Note that the dcm_mergeSerializedGrids auto-call macro is delivered with SAS Intelligent Decisioning.

%dcm_mergeSerializedGrids(
      mergeTable=customerBase,
      mergekey=customerID,
      outputTable=customerGrid,
      gridTables=addressGrid phoneGrid,
      gridMergeKeys=customerID customerID,
      gridColumns=address phone);

2. Examine the log and ensure that the output SAS table CustomerGrid was created. Examine the table. See that it contains the base Customer information as well as the serialized phone and address information.

For click-by-click instructions, watch the demonstration: Demo: DataGrids 03


Load the Serialized and Joined Table to CAS

1. Load the customerGrid table to CAS.

cas mySession sessopts=(metrics=true);
caslib _all_ assign;
proc casutil;
   load data=work.customerGrid casout="customerGrid" outcaslib="CASUSER" promote;
run;
quit;

2. Examine the log and ensure that the customerGrid table exists in the CASUSER caslib.

For click-by-click instructions, watch the demonstration: Demo: DataGrids 04

Terminate Your CAS Session

After you confirmed the CAS table was created, please run the following code in SAS Studio to terminate your CAS session.

cas mySession terminate;


Create a Customer Campaign Rule Set

In SAS Intelligent Decisioning:

1. Create a Rule Set with the following characteristics:

  • Name: customerCampaign

  • Location: /MyFolder

Variables

Variable Name      Data Type   Input   Output   Notes
address            Data Grid     x       x      Import from customerGrid
age                Decimal       x       x      Import from customerGrid
customerID         Decimal       x       x      Import from customerGrid
firstName          Character     x       x      Import from customerGrid
lastName           Character     x       x      Import from customerGrid
phone              Data Grid     x       x      Import from customerGrid
realEstateValue    Decimal               x      Custom
propertyCount      Decimal               x      Custom
makeOffer          Boolean               x      Custom

Change the data type for the address and phone columns to Data Grid.

For click-by-click instructions, watch the following demonstrations: Demo: DataGrids 05 and Demo: DataGrids 06

Assignments

1. realEstateValue = DATAGRID_SUM(address,'ESTIMATEDPRICE')

    Note:  EstimatedPrice is a field inside the data grid.  The function sums the values in the nested rows.

2. propertyCount = DATAGRID_COUNT(address)

    Note:  The function counts the nested rows.

For click-by-click instructions, watch the following demonstration: Demo: DataGrids 07

Rule

Name:  offerConditions

Logic:  If propertyCount > 1 AND realEstateValue > 300000 then makeOffer = TRUE

For click-by-click instructions, watch the following demonstration: Demo: DataGrids 08
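For example, customer 1 (John Frizzel) has two addresses in the data grid, with estimated prices of 300000 and 400000. The assignments therefore produce propertyCount = 2 and realEstateValue = 700000, both conditions of the rule are met, and makeOffer is set to TRUE. You can verify these values in the scoring output shown later in this exercise.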


Test the Rule Set

1. Test the rule set and validate the rules are being applied correctly.

2. For click-by-click instructions, watch the following demonstration: Demo: DataGrids 09


Publish the Rule Set

1. Save the rule set and publish it to the SAS Micro Analytic Service.

For click-by-click instructions, watch the following demonstration: Demo: DataGrids 10

2. Run the published rule set via the publishing validation mechanism in Intelligent Decisioning.

For click-by-click instructions, watch the following demonstration: Demo: DataGrids 11

Update: In the current LTS the publish screen may look slightly different.

Examine the test code and note the following:

1. The web service end point that executes the customerCampaign rule set is:

`https://sas-microanalytic-score/microanalyticScore/modules/customercampaign1_0/steps/execute_skel`

Note: The sas-microanalytic-score URL is not available from the client machine. To run the service from a Linux machine, you’ll need to use the following URL: https://$MYUSER-gel.eastus.cloudapp.azure.com:443/microanalyticScore/modules/customercampaign1_0/steps/execute_skel. The URL up to port 443 is only valid for this workshop.

2. The web service input message (the message to POST to the endpoint) must be in the following JSON format. Note that the ADDRESS and PHONE sub-tables (Data Grids) are represented in the JSON like any other field. The sub table structure is not specified.

{""version"":1,""inputs"":[
    {""name"":""ADDRESS_"",""value"":""{{ADDRESS}}""},
    {""name"":""AGE_"",""value"":""{{AGE}}""},
    {""name"":""CUSTOMERID_"",""value"":""{{CUSTOMERID}}""},
    {""name"":""FIRSTNAME_"",""value"":""{{FIRSTNAME}}""},
    {""name"":""LASTNAME_"",""value"":""{{LASTNAME}}""},
    {""name"":""PHONE_"",""value"":""{{PHONE}}""}
    ]
}

Score the Decision

To score the decision, get a new access token and score the decision:

# Get SAS Viya token
ACCESS_TOKEN="$(curl -X POST "${INGRESS_URL}/SASLogon/oauth/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=${GRANT_TYPES}&client_id=${CLIENT_ID}&client_secret=${CLIENT_SECRET}" | jq -r '.access_token')"
echo $ACCESS_TOKEN

# Execute Decision
DECISION_URL="${INGRESS_URL}:443/microanalyticScore/modules/customercampaign1_0/steps/execute"
echo $DECISION_URL

curl -X POST "${DECISION_URL}" \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json;charset=utf-8" \
  -H "Accept: application/json" \
  --data '{
  "version":1,
  "inputs":[
  {"name":"ADDRESS_","value":[{"metadata":[{"ADDRESS1":"string"},{"ADDRESS2":"string"},{"CITY":"string"},{"ESTIMATEDPRICE":"decimal"},{"STATE":"string"}]},{"data":[["42 Red Storm St","","Lancaster",300000,"PA"],["24 Coastal Ave","21","Beach Town",400000,"NJ"]]}]},
  {"name":"AGE_","value":52},
  {"name":"CUSTOMERID_","value":1},
  {"name":"FIRSTNAME_","value":"John"},
  {"name":"LASTNAME_","value":"Frizzel"},
  {"name":"PHONE_","value":[{"metadata":[{"PHONENUMBER":"string"},{"PRIMARY":"decimal"},{"TYPE":"string"}]},{"data":[["555-4304",0,"Work"],["555-3209",1,"Mobile"],["555-4172",0,"Home"]]}]}
  ]}' | jq

You will see:


{
  "links": [],
  "version": 2,
  "moduleId": "customercampaign1_0",
  "stepId": "execute",
  "executionState": "completed",
  "metadata": {
    "module_id": "customercampaign1_0",
    "step_id": "execute"
  },
  "outputs": [
    {
      "name": "address",
      "value": [
        {
          "metadata": [
            {
              "ADDRESS1": "string"
            },
            {
              "ADDRESS2": "string"
            },
            {
              "CITY": "string"
            },
            {
              "ESTIMATEDPRICE": "decimal"
            },
            {
              "STATE": "string"
            }
          ]
        },
        {
          "data": [
            [
              "42 Red Storm St",
              "",
              "Lancaster",
              300000,
              "PA"
            ],
            [
              "24 Coastal Ave",
              "21",
              "Beach Town",
              400000,
              "NJ"
            ]
          ]
        }
      ]
    },
    {
      "name": "age",
      "value": 52
    },
    {
      "name": "customerID",
      "value": 1
    },
    {
      "name": "firstName",
      "value": "John"
    },
    {
      "name": "lastName",
      "value": "Frizzel"
    },
    {
      "name": "makeOffer",
      "value": 1
    },
    {
      "name": "phone",
      "value": [
        {
          "metadata": [
            {
              "PHONENUMBER": "string"
            },
            {
              "PRIMARY": "decimal"
            },
            {
              "TYPE": "string"
            }
          ]
        },
        {
          "data": [
            [
              "555-4304",
              0,
              "Work"
            ],
            [
              "555-3209",
              1,
              "Mobile"
            ],
            [
              "555-4172",
              0,
              "Home"
            ]
          ]
        }
      ]
    },
    {
      "name": "propertyCount",
      "value": 2
    },
    {
      "name": "realEstateValue",
      "value": 700000
    }
  ]
}

This is similar to step 3, Score the SAS decision published to SAS Micro Analytic Service, in the 05_041_Execute_SAS_Micro_Analytic_Service hands-on.

Conclusion

You learned how to work with Data Grids in SAS.


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 052 Azure SQL DB

Azure SQL Server and Database

Create an Azure SQL Database server and a database that will later be used in a data query.

Objective

You might only want to bid on cars that are not in a debt recovery database.

You need to create a decision that queries a table in an Azure SQL database. The table contains all the cars in a debt recovery database. If the car’s VIN is found in that database, a variable Hit is initialized.

You might then want to call the autoAuction decision (a decision within a decision) only for the cars that are not in the debt recovery process.

Obviously, you do not want to waste time or tie up capital in a potentially long legal process, so you simply avoid those cars.

Our target looks like:

Create an Azure SQL Server and database

For click-by-click instructions, watch the demonstration: Demo: Data Queries Az 01


Create an Azure SQL Server:

################################
# Credits: Uttam Kumar
# https://gitlab.sas.com/GEL/workshops/PSGEL286-sas-viya-4-data-management-on-azure-cloud/-/blob/main/scripts/5_1_Create_Azure_MSSQL_Server.sh
################################

# Create an Azure MS-SQL server under your Resource group

## Variables
# Get your USER Prefix
getprefix

# Azure LOGIN
azlogin

# Get workshop variables
RG=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep resource-group | awk -F'::' '{print $2}')
LOCATION=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep location | awk -F'::' '{print $2}')
PREFIX=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep prefix | awk -F'::' '{print $2}')
TENANTID=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep tenant | awk -F'::' '{print $2}')
WORKSHOP_SUBSCRIPTION_ID=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep subscription | awk -F'::' '{print $2}')
SUBSCRPTN=$(cat ${VARS_DIR}/variables.txt | grep subscription | awk -F'::' '{print $2}')
PREFIXNODASH=$(echo $PREFIX | sed 's/-//g')
CLIENTID=$(az ad sp list --filter "displayname eq '${GIT_WKSHP_CODE}_sp'"  --query '[].{appId:appId}' -o tsv)

# Get TAGS
TAGS=$(cat /home/cloud-user/MY_TAGS.txt | sed 's/[,"]//g;s/ = /=/g')

# Get Storage Account
SANAME=$(echo $PREFIX | sed 's/-//g')sa

# DB variables
AKS_MY_USER=viyadep
sql_server_name=${RG}-sqlsrv
sql_server_admin_user=$AKS_MY_USER
sql_server_admin_pwd=lnxsas@2020
sqldb_name=geldb

###  Create a SQL Server  ####
echo "Creating a SQL Server  "
sql_serv_sts=`az sql server create \
--name $sql_server_name \
--admin-user $sql_server_admin_user \
--admin-password $sql_server_admin_pwd \
--resource-group $RG \
--location $LOCATION \
--enable-public-network true \
--assign-identity \
-o tsv `
echo $sql_serv_sts

It will take about 5 minutes to create the SQL Server.

Wait for the creation of the server before moving further!
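If you prefer a command-line check while you wait, the following sketch (reusing the variables set above) shows the provisioning state; expect Ready when the server is done:

# Optional: check the provisioning state of the SQL Server (expect "Ready")
az sql server show \
  --name $sql_server_name \
  --resource-group $RG \
  --query state -o tsv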

Check the Creation in the Azure Portal

On the Azure portal, in the $RG resource group, you should see a SQL Server.

Create SQL Server Firewall Rules

### Create a Firewall Rule at MS-SQL Server
echo "Creating a Firewall for Azure SQL Server "
firewall_rule_sts=`az sql server firewall-rule create  \
-g $RG \
-s $sql_server_name \
-n $RG-rule \
--start-ip-address 0.0.0.0 \
--end-ip-address 0.0.0.0 `
echo $firewall_rule_sts

### Create a 2nd Firewall Rule at MS-SQL Server
echo "Creating a Firewall for Azure SQL Server "
firewall_rule_sts2=`az sql server firewall-rule create  \
-g $RG \
-s $sql_server_name \
-n $RG-caryip-rule \
--start-ip-address 149.173.0.0 \
--end-ip-address 149.173.255.255 `
echo $firewall_rule_sts2

Third Firewall Rule

On your RACE Windows machine: Start > type cmd > open the command prompt > run ipconfig.

Note the value corresponding to IPv4 Address, for example 10.96.3.246

MYIP=enter_copied_IP_here

Create a firewall rule for a range of addresses:

### Create a 3rd Firewall Rule at MS-SQL Server
echo "Creating a Firewall for Azure SQL Server "
firewall_rule_sts3=`az sql server firewall-rule create  \
-g $RG \
-s $sql_server_name \
-n windows-ip-config \
--start-ip-address $MYIP \
--end-ip-address $MYIP `
echo $firewall_rule_sts3

Check the Creation in the Azure Portal

In the SQL Server, go to Networking from the left blade:

Networking Rules

  • Rule 1: In the Networking section, a SQL Server firewall rule was created allowing access from Azure services. Notice how Allow services and resources to access this server is checked.

The firewall rule with start and end address 0.0.0.0 is what sets this check box. It must remain checked so that SAS Viya, which runs in the AKS cluster, can access the server.

  • Rule 2: A rule was created for the Cary VPN IP range: --start-ip-address 149.173.0.0 and --end-ip-address 149.173.255.255.

  • Rule 3: A rule was created for your RACE Windows machine IP:

If you want to query from outside Azure, do not forget this important setting: every IP address from which you want to access the server must be covered by a firewall rule.
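To double-check which rules exist without opening the portal, you can list them from the command line (a quick sketch using the same variables as above):

# Optional: list the firewall rules defined on the SQL Server
az sql server firewall-rule list \
  -g $RG \
  -s $sql_server_name \
  -o table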

Create a Database in the SQL Server

echo "Creating a database at Azure SQL Server "
sql_serv_db_sts=`az sql db create \
--name $sqldb_name \
--server $sql_server_name \
--resource-group $RG \
--service-objective Basic \
--edition Basic \
--tags name=$TAGS \
--zone-redundant false `
echo $sql_serv_db_sts

We will check the database later.
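Optionally, you can already confirm from the command line that the database was created (a minimal sketch reusing the variables above); expect Online:

# Optional: confirm the geldb database exists and is Online
az sql db show \
  --name $sqldb_name \
  --server $sql_server_name \
  --resource-group $RG \
  --query status -o tsv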

Conclusion

You created the necessary Azure infrastructure to work with your data query in SAS Intelligent Decisioning.

End



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 054 SAS Viya Config Azure SQL

Query the SQL database from SAS

You can create tables in the SQL database directly from SAS Studio.

For click-by-click instructions, watch the demonstration: Demo: Data Queries Az 02

Collect Variables

Copy your SQL Server Name:

Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass.

SAS Studio

Switch to SAS Studio:

  • Add /SASStudio/ after the SAS Viya URL or,
  • Choose from the Applications menu, Develop Code and Flows.

Connect to AZ SQL database

  • Write a NEW SAS program to connect to the database.
  • Paste your Azure SQL Server name into the %let MYSRV=my_SQL_server_name; statement.
  • Replace my_SQL_server_name with your SQL Server name.
  • For example: sbxbot-p03205-rg-sqlsrv.database.windows.net

With SAS COMPUTE

Copy the program below and make sure MYSRV points to your Azure SQL Server:

/* ##################################################################
# Code Snippet Credits: Nicolas Robert ; Uttam Kumar ; Bogdan Teleuca ;
##################################################################### */

/* important variables */

* Add your SQL SRV HERE! ;
%let MYSRV=my_SQL_server_name;
*%let MYSRV='sbxbot-p03205-rg-sqlsrv.database.windows.net'; *# Az SQL Server - example only;
%let MYUID='viyadep'; *# Az SQL DB User;
%let MYPWD='lnxsas@2020'; *# Az SQL DB Pass;
%let MYDB='geldb'; *# Az SQL DB;

* Hardcoded library name - change $RG below;
libname sqlnew sqlsvr
      complete="Driver=SAS ACCESS to MS SQL Server;IANAAppCodePage=106;db=geldb;uid=viyadep;pwd=lnxsas@2020;Host=&MYSRV;port=1433;SSLLibName=/usr/lib64/libssl.so.10;CryptoLibName=/usr/lib64/libcrypto.so.10;ValidateServerCertificate=0;EnableScrollableCursors=4;"
      schema="dbo" ;

/*
* This won't run because of the quotes... ;
libname sqlnew sqlsvr
      complete="Driver=SAS ACCESS to MS SQL Server;IANAAppCodePage=106;db=&MYDB;uid=viyadep;pwd=&MYPWD;Host=&MYSRV;port=1433;SSLLibName=/usr/lib64/libssl.so.10;CryptoLibName=/usr/lib64/libcrypto.so.10;ValidateServerCertificate=0;EnableScrollableCursors=4;"
      schema="dbo" ;
*/

* Save table to az sql database ;
data sqlnew.cars;
set sashelp.cars;
run;
quit;

Execute the program and open the new library and SQL database table.

From CAS

Write a new SAS program. Copy your MYSRV server name from the previous SAS code. For example:

%let MYSRV='sbxbot-p03205-rg-sqlsrv.database.windows.net';
  • Then paste the following code:
/* ####################################################################
# Code Snippet Credits: Nicolas Robert ; Uttam Kumar ; Bogdan Teleuca ;
####################################################################### */

CAS mySession SESSOPTS=(CASLIB=casuser TIMEOUT=99 LOCALE="en_US" metrics=true);
caslib _ALL_ assign ;

* Drop if exists ;
proc cas ;
   table.dropCaslib / caslib="azsqldb" quiet=true ;
quit ;

* Hardcoded library name - change $RG below;
caslib azsqldb datasource=(
      srctype="sqlserver"
      conopts="Driver=SAS ACCESS to MS SQL Server;IANAAppCodePage=106;db=geldb;uid=viyadep;pwd=lnxsas@2020;Host=&MYSRV;port=1433;SSLLibName=/usr/lib64/libssl.so.10;CryptoLibName=/usr/lib64/libcrypto.so.10;ValidateServerCertificate=0;EnableScrollableCursors=4;"
        schema="dbo" ) libref=azsqldb ;

caslib _ALL_ assign ;


proc casutil outcaslib="azsqldb" incaslib="azsqldb";
list files;
list tables;
quit;


/* ## Load CAS data to MS-Sql Server caslib */
proc casutil outcaslib="azsqldb" incaslib="azsqldb";
droptable casdata="cars" quiet;
load data=sashelp.cars casout="cars" replace;
list files;
quit;

/* ## Save CAS data to MS-Sql Server database

proc casutil incaslib="azsqldb";
save casdata="cars" casout="cars_new" replace ;
list files; list tables;
quit;
*/

* CAS mySession TERMINATE;

Execute the program. Expand the caslib and the table CARS.

Delete the Created Objects

You now have an example of how to connect from SAS. We will create the tables with a SQL script in the Azure portal. Therefore:

  • Delete the table in the SQLNEW library, then delete the library.
  • Delete the table in the AZSQLDB caslib, then delete the caslib.
  • Execute CAS mySession TERMINATE; before exiting SAS Studio.

Conclusion

You learned how to connect and save a table from SAS Studio.

End



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 056 Azure SQL DB Query

Azure SQL DB Query Editor

You can query the DB in the Azure portal, via the Query editor option:

For click-by-click instructions, watch the demonstration: Demo: Data Queries Az 03


Query Editor

In the Azure portal:

  • Select your ${RG} (left blade > Resource Groups)
  • Choose your SQL Server ${RG}-sqlsrv
  • Go to SQL databases from the left blade.
  • Select geldb ($RG-sqlsrv/geldb)
  • From the left blade of the DB select Query editor (preview):

  • Wait until you see a green tick next to geldmui@gelenable.sas.com, SQL server authentication.
  • If you do not see it within 10 seconds or so, refresh the browser.
  • Login using user and password:
    • Login: viyadep (Should be filled)
    • PWD: lnxsas@2020

Click on OK.

Troubleshooting

If you are having trouble connecting, see the following two paragraphs.

Log-in Issues

If you cannot log-in:

  • Make sure you are accessing from your PC.
  • You must be connected with VPN - Option 3 Direct to Cary. Refresh the page.
  • You must see the green tick on the geldmui@gelenable.sas.com, otherwise your IP will be blocked.

  • Continue as geldmui@gelenable.sas.com

Firewall Rules

You might get the following message:

Click on the Allowlist... to add a SQL Server firewall rule, to allow access from your IP.

In Query Editor

Once you are in the Query editor (which can be a little tricky to access), run the statements below.

Create Table

create table autohit
(
    Hit int,
    VIN varchar(50) not null
)
go

Run.

From the left:

  • Refresh
  • Expand Tables. You should see:

Insert Rows

INSERT INTO dbo.autohit
       ( Hit, VIN )
VALUES
       (1, '12345678901234560' ),
       (1, '12345678901234561' ),
       (1, '12345678901234562' )
go

Run.

Log:

Query succeeded: Affected rows: 3

Query Table

SELECT VIN, Hit
FROM dbo.autohit
ORDER BY VIN desc

Run.

Create Another Table

Write the following query to create another table and insert a few rows and Run:

create table autoauctioninput
    (
    Make varchar(50),
    Model varchar(50),
    state varchar(50),
    Year int,
    BlueBookPrice int,
    CurrentBid int,
    Miles int,
    OriginalInvoice int,
    OriginalMSRP int,
    VIN varchar(50)
    )
go

INSERT INTO dbo.autoauctioninput
       (Make,Model,state,Year,BlueBookPrice,CurrentBid,Miles,OriginalInvoice,OriginalMSRP,VIN)
VALUES
       ('Honda', 'Accord', 'LA', 2009,5000,3000,50000,30000,35000,'12345678901234567'),
('Kia', 'Soul', 'CA',2016,8000,9000,68000,18000,19500,'12345678901234568'),
('Honda', 'Civic', 'AR',2017,28000,20000,20000,32000,34000,'12345678901234569'),
('Ford', 'Fusion', 'CA',2012,9000,9000,70000,18000,19500,'12345678901234560'),
('Honda', 'Pilot', 'MN',2012,10000,3000,100000,45000,50000,'12345678901234561'),
('Tesla', 'X100D', 'CA',2017,80000,90000,5000,100000,100000,'12345678901234562'),
('Honda', 'CRV', 'PA' ,2009,12000,8000,270000,30000,35000,'12345678901234563'),
('Buick', 'Regal', 'NJ' ,2012,8000,7000,82000,35000,40500,'12345678901234564'),
('BMW', '328i', 'NY',2015,35000,40000,4000,55000,60000,'12345678901234565'),
('Scion', 'TC', 'PA',2016,10000,9000,20000,18000,19500,'12345678901234566'),
('Honda', 'Accord', 'MA',2010,12000,11000,80000,34000,35000,'12345678901234571'),
('Ford', 'F150', 'FL' ,2016,45000,46000,90000,68000,79500,'12345678901234572'),
('GMC', 'Terrain', 'SC' ,2015,40000,30000,40000,60000,65000,'12345678901234573'),
('Ford', 'Fusion', 'CA',2012,8000,9000,59000,18000,19500,'12345678901234574')
go

SELECT Make, Model, state, dbo.autohit.VIN AS VIN, Hit
FROM dbo.autoauctioninput LEFT JOIN dbo.autohit
ON (dbo.autohit.VIN = dbo.autoauctioninput.VIN)

Run.

You will see:

As an alternative, we could have saved the same table from CASUSER using SAS Studio.

Conclusion

This is what we want to achieve later in a decision: exclude from the bidding process all the cars with Hit=1.



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 058 Configure MAS Azure SQL DB

SAS Micro Analytic Service Config

In the previous hands-on, you connected to the Azure SQL DB from SAS Studio and performed operations from CAS.

Now it is time to configure the connection in the SAS Micro Analytic Service pod, which uses another connection path.

For click-by-click instructions, watch the demonstration: Demo: Data Queries Az 04


Backup sas-micro-analytic-score Pod Config

If for any reason you break the pod, this backup ensures that you can always return to the previous state.

cd ~
echo $ROOTDIR

mkdir -p $ROOTDIR/mas
cd $ROOTDIR/mas
pwd && ls

export KUBECONFIG=~/.kube/config
echo $KUBECONFIG

kubectl -n $GELENV_NS get deployment sas-microanalytic-score -o yaml > mas-deployment-good.yaml
ls
cd ~

Later, if the pod does not restart, you can apply the generated YAML file to revert to the previous working state: kubectl -n $GELENV_NS apply -f ~/PSGEL299-sas-viya-4-sas-in-database-technologies/mas/mas-deployment-good.yaml

Generate the SAS Micro Analytic Service connection string

cd ~
sql_server_name=$RG-sqlsrv
sql_server_admin_user=viyadep
sql_server_admin_pwd=lnxsas@2020
sqldb_name=geldb

echo "driver=sql;conopts=(DRIVER=MSSQLSVR;CONOPTS=(driver=SAS ACCESS to MS SQL Server;db=${sqldb_name};uid=${sql_server_admin_user};pwd=${sql_server_admin_pwd};Host=${sql_server_name}.database.windows.net;port=1433;SSLLibName=/usr/lib64/libssl.so.10;CryptoLibName=/usr/lib64/libcrypto.so.10;ValidateServerCertificate=0;))" > ~/MY_MAS_CON_STR.txt
cat ~/MY_MAS_CON_STR.txt

Copy the generated string, e.g.

driver=sql;conopts=(DRIVER=MSSQLSVR;CONOPTS=(driver=SAS ACCESS to MS SQL Server;db=geldb;uid=viyadep;pwd=lnxsas@2020;Host=sbxbot-p03193-rg-sqlsrv.database.windows.net;port=1433;SSLLibName=/usr/lib64/libssl.so.10;CryptoLibName=/usr/lib64/libcrypto.so.10;ValidateServerCertificate=0;))

Important Note

In the SAS Micro Analytic Service Documentation - Microsoft Azure SQL Database via ODBC Driver Reference you will see a reference to ENABLE_MARS=YES:

If you disable autocommit and use a Microsoft SQL database, you must set ENABLE_MARS=YES in your connection string. This setting enables SAS Micro Analytic Service to allow transactions under a given Microsoft SQL Server connection.

In our environment, you must leave it out, because it throws an ERROR message in the pod log.
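As a quick sanity check before pasting the string into SAS Environment Manager, you can verify that the file generated above does not contain the setting (a minimal sketch; the file name comes from the earlier step):

# Optional: make sure ENABLE_MARS is not part of the generated connection string
if grep -q 'ENABLE_MARS' ~/MY_MAS_CON_STR.txt; then
  echo "Remove ENABLE_MARS=YES before pasting the string"
else
  echo "OK: no ENABLE_MARS in the connection string"
fi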

In SAS Environment Manager

Configure the service “Micro Analytic Score service”.

Go to Settings > search for micro > select Micro Analytic Score service:

Minimize the configuration instances.

Edit sas.microanalyticservice.properties configuration:

In connectionstring, paste the string generated above, customized for your user (it starts with driver=sql;conopts=(DRIVER=...)):

Save.

Restart SAS Micro Analytic Service

Restart the SAS Micro Analytic Service pod by entering the following statements:

# Find the pod name for  SAS Micro Analytic Service
MASPOD=$(kubectl -n $GELENV_NS get pods | grep microanalytic | awk 'NR==1{print $1}')

# Restart the  SAS Micro Analytic Service pod
kubectl -n $GELENV_NS delete pod $MASPOD

Continue to the next hands-on

You don't need to wait; the restart takes around five minutes.
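If you do want to watch the restart, one way to block until the new pod is ready is the rollout status command below (a sketch assuming the same $GELENV_NS namespace and the sas-microanalytic-score deployment name used earlier):

# Optional: wait until the new SAS Micro Analytic Service pod is ready
kubectl -n $GELENV_NS rollout status deployment sas-microanalytic-score --timeout=10m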

Use the following statements to check the pod and the logs:

# Get the NEW  SAS Micro Analytic Service POD
MASPOD=$(kubectl -n $GELENV_NS get pods | grep microanalytic | awk 'NR==1{print $1}')
kubectl -n $GELENV_NS describe pod $MASPOD
kubectl logs --selector='app=sas-microanalytic-score' -c sas-microanalytic-score

Repeat the logs statement a few times, say every 15 seconds:

kubectl logs --selector='app=sas-microanalytic-score' -c sas-microanalytic-score

Repeat until you see only green successful messages or warnings such as: [SAS][ODBC SQL Server Wire Protocol driver][Microsoft SQL Server]Changed database context to 'geldb'.

{"headers":{"sas-content-type":"application\/vnd.sas.event;version=2","sas-deployment-id":"viya","sas-event-source":"SAS Logging Facility","sas-published-timestamp":"2022-09-12T09:31:38.613000+00:00"},"id":"26C6476D-1D91-A24B-8E88-B1A1919C0E53","payload":{"level":"warn","message":"[01000]WARNING: [SAS][ODBC SQL Server Wire Protocol driver][Microsoft SQL Server]Changed database context to 'geldb'. (0x1645)","messageID":"TKTS_JNL_MESSAGE","parameters":{"0":"01000","1":"WARNING: [SAS][ODBC SQL Server Wire Protocol driver][Microsoft SQL Server]Changed database context to 'geldb'.","2":"5701"},"properties":{"_lineNumber_":"118","_sourceFile_":"sklstoj.c","hostname":"sas-microanalytic-score-6bbc74497f-fkjqq","logger":"App.tk.MAS","processId":"62","thread":"00000022"},"source":"sas-microanalytic-score","timeStamp":"2022-09-12T09:31:38.613000+00:00","version":1},"payloadType":"application\/vnd.sas.event.log;version=1","timeStamp":"2022-09-12T09:31:38.613000+00:00","type":"log","user":"sas","version":2}

Only In Case of an Error

Only if you see ERROR when you run:

# Run the Debug statement
kubectl logs --selector='app=sas-microanalytic-score' -c sas-microanalytic-score

You have to find out what the error is about and solve it. Check your connection string: the SQL Server name, the database, the credentials, and so on.

Conclusion

You learned how to add an Azure SQL DB to the SAS Micro Analytic Service.

End



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 060 Data Query

Data Query

Write and test a data query.

For click-by-click instructions, watch the demonstration: Demo: Data Queries Az 05


Logic

We want to enrich the autoauctioninput table with the Hit variable from the autohit table. Then, we want to exclude all cars that have a Hit=1.

Best Practice

Always test your SQL in the Query Editor or another utility, before transforming it for SAS Intelligent Decisioning.

Resource: Query the DB.

Query Editor

In the Azure Portal, select your SQL Server and the DB geldb on this server. Select the Azure SQL Query Editor (user: viyadep, pwd: lnxsas@2020)

If for some reason you cannot connect to the Query Editor, skip ahead.

select dbo.autoauctioninput.VIN AS VIN, Hit
FROM dbo.autoauctioninput LEFT JOIN dbo.autohit
ON (dbo.autohit.VIN = dbo.autoauctioninput.VIN)

In SAS Intelligent Decisioning, the above code becomes:

/* include sqlReturnInfo */
SELECT dbo.autoauctioninput.VIN AS {:inputVIN:string:17}, Hit AS {:returnHit:decimal}
FROM dbo.autoauctioninput LEFT JOIN dbo.autohit
ON (dbo.autohit.VIN = dbo.autoauctioninput.VIN)

Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass.

SAS Intelligent Decisioning

Switch to SAS Intelligent Decisioning:

  • add /SASDecisionManager/ after the SAS Viya URL or,
  • choose Build Decisions from the Applications menu.

New Code File

Go to Code Files > New Code File.

  • Name it AzSQLDbDataQuery
  • Choose type Data Query
  • Save.

Write the following:

/* include sqlReturnInfo */
SELECT dbo.autoauctioninput.VIN AS {:returnVIN:string:17}, Hit AS {:returnHit:decimal}
FROM dbo.autoauctioninput LEFT JOIN dbo.autohit
ON (dbo.autohit.VIN = dbo.autoauctioninput.VIN)
WHERE dbo.autohit.VIN = {?:queryVIN:string:17}

Code Compiled Correctly

Press Validate > Run Validation, if prompted.

When the code compiles correctly, click Save.

Variables

Go to the Variables tab. Check the variables:

If you see them as in the screen above, go to Test the Data Query below. They are automatically created when you write the code.

INFO: Code Error

If you wrote SELECT VIN instead of SELECT dbo.autoauctioninput.VIN:

/* include sqlReturnInfo */
SELECT VIN AS {:returnVIN:string:17}, Hit AS {:returnHit:decimal}
FROM dbo.autoauctioninput LEFT JOIN dbo.autohit
ON (dbo.autohit.VIN = dbo.autoauctioninput.VIN)
WHERE dbo.autohit.VIN = {?:queryVIN:string:17}

the query would not know which table to retrieve VIN from, because the column exists in both tables, and it would throw an error:

You can check the logs:

 kubectl logs --selector='app=sas-microanalytic-score' -c sas-microanalytic-score

Which will result in:

You will notice the error, but that won’t help you debug the SQL Query.

sql_executer shows your SQL is wrong.
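To narrow the log down to the relevant lines, you can filter the same command used above (a convenience sketch only):

# Optional: show only error lines and sql_executer messages from the MAS log
kubectl logs --selector='app=sas-microanalytic-score' -c sas-microanalytic-score | grep -iE 'error|sql_executer'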

Put the code back to:

/* include sqlReturnInfo */
SELECT dbo.autoauctioninput.VIN AS {:returnVIN:string:17}, Hit AS {:returnHit:decimal}
FROM dbo.autoauctioninput LEFT JOIN dbo.autohit
ON (dbo.autohit.VIN = dbo.autoauctioninput.VIN)
WHERE dbo.autohit.VIN = {?:queryVIN:string:17}

Save.

Test the Data Query

Go to the Scoring tab > New test.

Select the data from CASUSER.AUTOAUCTIONINPUT (PATH caslib table).

You will be warned that you need to map variables.

Map the variables: queryVIN to VIN:

Expand the Advanced section and set the output table name to AUTOHITOUT (write it in the box). The table sits in the CASUSER caslib; we will need it in the next section.

Run and look at the output:

Click on the AzSQLDbDataQuery_dgo data grid, where the rowCount = 1.

The result is a data grid containing the return fields specified in the queries.

Click on the AzSQLDbDataQuery_dgo data grid, where the rowCount = 0.

Close.

Code Transformation

The SQL syntax for SAS Intelligent Decisioning data queries is a bit exotic. For more information, read:

INFO ONLY: Solution

To create the Data Query variables, you can import them from solutions/AzSQLDbDataQuery_variables.csv:

To import the variables, download the .csv file to your PC first and then import it in the SAS Intelligent Decisioning interface:

Variable Type Input Output
AzSQLDbDataQuery_dgo Data grid checked unchecked
queryVIN Decimal checked unchecked
returnCode Integer unchecked checked
rowCount Integer unchecked checked

Conclusion

Congrats, you learned how to call data from an AZ SQL DB in a Data Query.

End



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 062 Decision

Decision with Data Query

You might only want to bid on cars that are not in a debt recovery database.

You need to create a decision that queries a table in an Azure SQL database. The table contains all the cars in a debt recovery database. If the car's VIN is found in that database, a variable Hit is initialized. You might then want to call the autoAuction decision (a decision within a decision) only for the cars that are not in the debt recovery process.

Our target looks like:

Create a New Rule Set


Go to Rule sets. Create a new one called autoHit.

Go to the Variables tab. Create the following variables. You can import them from the /solutions/autoHit_variables.csv file.

To import the variables, download the .csv file to your PC first, then click Import in the SAS Intelligent Decisioning interface:

Variable Type Input Output
AzSQLDbDataQuery_dgo Data grid checked checked
Hit Decimal unchecked checked

Rule

Create a rule with two assignments:

  • ASSIGN Hit 0
  • Add assignment:
  • Hit = DATAGRID_COUNT(AzSQLDbDataQuery_out)

Save.

Test the Rule Set

Go to the Scoring tab > New test.

Select the AUTOHITOUT. Refresh if you do not see the table. This is the output of the Data Query test.

Map the variables: AzSQLDbDataQuery_out to AzSQLDbDataQuery_dgo:

Run and look at the output. Click on Results:

A variable Hit was initialized, based on a data grid function.

Create a New Decision


Call it autoHitDec.

Select AzSQLDbDataQuery from My Folder.

Variables

Go to the Variables tab. Create the following variables. You can import them from the solutions/autoHitDec_variables.csv file.

To import the variables, download the .csv file to your PC first, then click Import in the SAS Intelligent Decisioning interface:

Decision Flow

Add to the Decision Flow:

Data Query

Add the AzSQLDbDataQuery from My Folder:

Map the variables:

  • Input: queryVIN to VIN:

  • Output: AzSQLDbDataQuery_dgo to AzSQLDbDataQuery_out:

Rule Set

Under the data query add the autoHit from My Folder:

Branch

Under the rule set, create a branch:

Select the Branch variable Hit:

Add 0 in the Value.

Decision

Select the Hit=0 branch with the mouse and add the autoAuctionDec from My Folder:

As a result:

Save the decision.

Test the Decision

Go to the Scoring tab > New test.

Select the AZSQLDB.AUTOAUCTIONINPUT. Refresh if you do not see the table.

Run and look at the output. Click on Results:

Conclusion

You can see that three cars were excluded from the bid process.

Two of the three excluded cars are from California. In the autoAuctionDec decision, these cars would have received the Buy Anything from California!!! bidCommand.

But in the autoHitDec decision, the data query retrieved data from the Azure SQL geldb database. In the dbo.autohit table, those three cars are involved in a debt recovery process.

The objective of this decision was to exclude those cars from the bid process.

End



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 07, Section 0 Exercise: 06 094 Azure DBs Clean

Azure SQL DB Clean Resources

The correct removal sequence is the following:

  • Remove the connection string from the SAS Micro Analytic Service pod, via settings in SAS Environment Manager. See 06_058_Configure_MAS_Azure_SQL_DB.
  • Restart the SAS Micro Analytic Service pod and make sure the new pod is Running.
  • Remove the Azure resources using the code below:

If you remove only the Azure resources, your SAS Micro Analytic Service pod will be left in an Error state, disrupting other operations.
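Before deleting anything in Azure, you can confirm that the restarted pod is Running (a quick sketch using the same namespace variable and label selector as in the earlier steps):

# Optional: verify the SAS Micro Analytic Service pod is Running before removing Azure resources
kubectl -n $GELENV_NS get pods -l app=sas-microanalytic-score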

Remove DB

## Variables
# Get your USER Prefix
getprefix

# AZ LOGIN
azlogin

# Get workshop variables
RG=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep resource-group | awk -F'::' '{print $2}')
PREFIX=$(cat ${GELENABLE_DIR}/vars/variables.txt | grep prefix | awk -F'::' '{print $2}')
sql_server_name=${RG}-sqlsrv
sqldb_name=geldb

# Delete the DB
az sql db delete --name $sqldb_name --resource-group $RG --server $sql_server_name --yes

# Remove Server
az sql server delete --name $sql_server_name --resource-group $RG --yes

End



Lesson 08

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 08, Section 0 Exercise: 07 016 Python Code File

Python Code File

Quickly test the SAS Micro Analytic Service Python configuration.

For click-by-click instructions, watch the demonstration: Demo: Python 01


Test Data

Go to the /home/cloud-user/PSGEL267-using-sas-intelligent-decisioning-on-sas-viya/solutions/ folder and download the solutions/x1x2.csv file to the client machine.

Import it as a local file with SAS Environment Manager and create an in-memory table in CASUSER called X1X2.

Code File

1. Within SAS Intelligent Decisioning, create a new code file > Python code file. Name it “mypy.”

2. Copy the code below, paste it, and save. Be careful with the indentation: Python is sensitive to those spaces.

''' List all output parameters as comma-separated values in the "Output:" docString. Do not specify "None" if there is no output parameter. '''
''' List all Python packages that are not built-in packages in the "DependentPackages:" docString. Separate the package names with commas on a single line. '''
''' DependentPackages: '''
def execute (x1,x2):
   'Output:y'
   y = x1 + x2
   return y

3. Sync Variables will create the inputs and outputs from the code. Click on each variable name.

4. Change the variable data type to decimal!

Save.

Test

Scoring > New Test > click on Variables next to the input table.

Map the variables > choose Use value, for example:

  • x1: 7
  • x2: 6

Run.

The test works because the SAS Micro Analytic Service host access mode, which requires an authenticated host account, was configured at image start-up.

Open the result; you should see 13 (x1+x2):

Decision

For click-by-click instructions, watch the demonstration: Demo: Python 02


1. Within SAS Intelligent Decisioning, create a new decision. Name it “pyDec.”

2. Create the following variables. Add variable > Code file > select My Folder / mypy > select all variables.

3. Add the code file > Python code > mypy

Validate and Save.

Publish to SAS Micro Analytic Service

4. Publish the decision to SAS Micro Analytic Service.

Success!

Publishing Validation

5. Go to Scoring > Publishing Validation.

6. Edit. Choose the input table X1X2 from CASUSER:

7. Run the SAS Micro Analytic Service publishing validation with the decision and CAS table. Go to Output.

Publishing Validation Works

8. Go to Output:

You will see how the Python code function was executed in SAS Micro Analytic Service.
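If you also want to call the published decision over REST, you can reuse the pattern from the earlier SAS Micro Analytic Service exercise. The sketch below assumes that INGRESS_URL and ACCESS_TOKEN are still set as in that exercise, that the published module ID is pydec (an assumption; check the actual module ID of your published decision), and that the input names match the decision variables x1 and x2:

# Hypothetical REST call to the published pyDec decision (module ID assumed to be pydec)
PYDEC_URL="${INGRESS_URL}:443/microanalyticScore/modules/pydec/steps/execute"

curl -X POST "${PYDEC_URL}" \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json;charset=utf-8" \
  -H "Accept: application/json" \
  --data '{"version":1,"inputs":[{"name":"x1","value":7},{"name":"x2","value":6}]}' | jq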

Conclusion

You created a simple Python file, a decision and tested the publishing validation in SAS Micro Analytic Service.

You tested the validation result and you could check the scoring validation output.



Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 08, Section 0 Exercise: 07 018 Python Code Decision

Python Code File

With the warm-up finished, let’s integrate a real code file in our decision.

Complete the Hands-On Exercises

If you haven't completed the previous hands-on exercises, do them now. This exercise builds on them.

  • 02_21_Rule_Set.
  • 02_31_Lookup_Table.
  • 03_061_Decision.
  • 03_062_Model.

As an alternative, to save time:

  • Download from solutions: 03_062_Model_carScore_Project and 03_062_Model_myFolder.

  • Create the CAS table mentioned in 02_21_Rule_Set or after importing 03_062_Model_myFolder, go to SAS Studio and run Load_AutoAuctionInput.sas.
  • Import the solutions:
    • 03_062_Model_carScore_Project then
    • 03_062_Model_myFolder.
  • For a reminder of the import click-path, see 99_010_ImportSolution.
  • Go to SAS Intelligent Decisioning:
    • Activate each of the two lookup tables: unwantedMakes and bidCommands.
    • Open autoAuctionDec.
    • Validate and save the decision.
    • Test the decision. Make sure there are no errors.


For click-by-click instructions, watch the demonstration: Demo: ImportSolution 02


New Python Code File

Objective: replace californiaOverride DS2 file with a Python equivalent.

We want to rewrite the DS2 code:

package "${PACKAGE_NAME}" /inline;
   method execute(in_out double Bid,
                  in_out varchar bidCommand,
                  in_out varchar state);
    if state = 'CA' then do;
       Bid = 1;
       bidCommand = 'Buy anything from California!';
   end;
   end;
endpackage;

as a Python file. The Python file does not produce exactly the same result, but a similar one:

# CaliforniaOverride PyMAS function
def execute(Bid, state):
    "Output: bidCommand"
    if Bid == 0:
        bidCommand = 'Do NOT bid on this car!!!'
    else:
        bidCommand = 'Bid on this car!!!'
    if state == 'CA':
        bidCommand = 'Buy anything from California!'
    return bidCommand

# test the PyMAS function (Python 3)
Bid=0
state='PA'
print(execute(Bid,state))

Bid=1
state='PA'
print(execute(Bid,state))

Bid=0
state='CA'
print(execute(Bid,state))

For click-by-click instructions, watch the demonstration: Demo: Python 03


Test the Python Code on sasnode01

cd ~
# Create CalifoniaOverride PyMAS function
tee  ~/pymas.py > /dev/null <<EOF

#!/usr/bin/env python
# coding: utf-8
# CaliforniaOverride PyMAS function
def execute(Bid, state):
    "Output: bidCommand"
    if Bid == 0:
        bidCommand = 'Do NOT bid on this car!!!'
    else:
        bidCommand = 'Bid on this car!!!'
    if state == 'CA':
        bidCommand = 'Buy anything from California!'
    return bidCommand

# test the PyMAS function (Python 3)
Bid=0
state='PA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

Bid=1
state='PA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

Bid=0
state='CA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

EOF

ls

# Test Python
python3 pymas.py

You should see:

With bid 0 and state PA the bidCommand is: Do NOT bid on this car!!!
With bid 1 and state PA the bidCommand is: Bid on this car!!!
With bid 0 and state CA the bidCommand is: Buy anything from California!

SAS Studio

You can also test the Python code in SAS Studio, because in this environment the Python configuration is not limited to SAS Micro Analytic Service.

Log in to SAS Viya

Connect to SAS Viya via URL with User and Pass:

SAS Studio

Switch to SAS Studio:

  • Add /SASStudio/ after the SAS Viya URL or,
  • Choose from the Applications menu, Develop Code and Flows.

Write a:

Python Program

# CaliforniaOverride PyMAS function
def execute(Bid, state):
    "Output: bidCommand"
    if Bid == 0:
        bidCommand = 'Do NOT bid on this car!!!'
    else:
        bidCommand = 'Bid on this car!!!'
    if state == 'CA':
        bidCommand = 'Buy anything from California!'
    return bidCommand

# test the PyMAS function (Python 3)
Bid=0
state='PA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

Bid=1
state='PA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

Bid=0
state='CA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

See the log:

SAS Program Containing Python

proc python ;
   submit ;
# CaliforniaOverride PyMAS function
def execute(Bid, state):
    "Output: bidCommand"
    if Bid == 0:
        bidCommand = 'Do NOT bid on this car!!!'
    else:
        bidCommand = 'Bid on this car!!!'
    if state == 'CA':
        bidCommand = 'Buy anything from California!'
    return bidCommand

# Test the PyMAS function (Python 3)
Bid=0
state='PA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

Bid=1
state='PA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

Bid=0
state='CA'
bidCommand=execute(Bid,state)
print(f"With bid {Bid} and state {state} the bidCommand is: {bidCommand}")

   endsubmit ;
run ;

Very well!

Python Decision

For click-by-click instructions, watch the demonstration: Demo: Python 04


1. Within SAS Intelligent Decisioning, duplicate the decision autoAuctionDec. Name the new duplicate: autoAuctionDecPy. Open it.

2. Below californiaOverride add a new Python code file in My Folder. Name it stateBid.

3. Open the code file editor. Paste inside:

''' List all output parameters as comma-separated values in the "Output:" docString. Do not specify "None" if there is no output parameter. '''
''' List all Python packages that are not built-in packages in the "DependentPackages:" docString. Separate the package names with commas on a single line. '''
''' DependentPackages: '''

# CaliforniaOverride PyMAS function
def execute(Bid, state):
    "Output: bidCommand"
    if Bid == 0:
        bidCommand = 'Do NOT bid on this car!!!'
    else:
        bidCommand = 'Bid on this car!!!'
    if state == 'CA':
        bidCommand = 'Buy anything from California!'
    return bidCommand

The code file was stripped of the tests executed on the jump box. Save.

4. Go to Variables. Click on each one of them and change their type:

Variable Name Data Type Input Output
Bid Boolean checked unchecked
bidCommand Character unchecked checked
state Character (length: 2) checked unchecked

Save. Close.

5. In the main decision flow, remove californiaOverride DS2 code.

Validate. Save the decision.

Publish to MAS

Publish the decision to MAS.

Scoring

For click-by-click instructions, watch the demonstration: Demo: Python 05


Go to Scoring > Publishing Validation.

Perform publishing validation with the AUTOAUCTIONINPUT input table. Run. Click on the Results > Output:

The Python file is at work in MAS.

The bidCommand variable is always filled now, as the Python code replaced the DS2 file.

Publish to Azure

If you have defined the publishing destination, you can publish the decision to Azure. Azure supports decisions containing Python code.

Publish to Git

If you have defined the publishing destination, you can publish the Decision to Git. Git supports decisions containing Python code.

Open the scoreResource.txt file and identify your Python code.

Hint: search for Bid == 0:
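If the published files are available on the Linux jump host, a quick way to locate the code is grep (a convenience sketch; adjust the path to wherever scoreResource.txt was saved):

# Optional: find the Python logic inside the published score resource
grep -n "Bid == 0" scoreResource.txt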

Test the Decision

Go to Scoring > Test it with the AUTOAUCTIONINPUT input table.

The test is successful, because Python is configured in this SAS Viya instance with MAS, CAS and Compute. You can consult the log message.

Summary

Python Code in SAS State of the Union:

  1. If you publish a decision with Python code to SAS Micro Analytic Service, publishing validation uses the SAS Micro Analytic Service pod. You need Python configured for the SAS Micro Analytic Service pod.

  2. If you test a decision with Python in SAS Intelligent Decisioning, it uses the CAS modelPublishing action set, which embeds the Python in DS2 code. You need Python configured for the CAS pod, the CAS enable settings, and CASHOSTAccount.

  3. If you want to test your Python code, to know what to put in a decision code file, you need SAS Studio, which runs in SAS Compute. You need the Python configuration for the SAS Compute pod, plus the compute and lockdown settings.

Therefore, make a distinction between a decision test, which uses CAS or SAS Compute, and a publishing validation, which uses only SAS Micro Analytic Service.

Conclusion

You learned how to create a decision containing Python code.

END OF HANDS-ON

Thank you for your time following this GEL Virtual Learning Environment!

The last step is to clean up the environment.

Go NOW to 99_002_FinalHouseKeeping and help SAS save on Azure costs.

Thank you.


Lesson 12

Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 12, Section 0 Exercise: 99 002 FinalHouseKeeping

Final Housekeeping

Run the following command on the sasnode01 MobaXterm session to delete all objects created in Azure, including SAS Viya Azure Kubernetes Service Cluster, Azure Container Registry, SQL Server, Web Apps, if applicable:

cleanup

You can now sign out from your Windows client and terminate your RACE reservation.


Using SAS Intelligent Decisioning on SAS Viya 4
Lesson 12, Section 0 Exercise: 99 010 ImportSolution

Import Solutions

If you haven’t completed the previous hands-on exercises, you can import a solution.

This exercise uses as an example 02_21_Rule_Set.

For click-by-click instructions, watch the demonstration: Demo: ImportSolution

Steps:

  • In the sasnode01 MobaXterm session, use SFTP to download the file from /home/cloud-user/PSGEL267-using-sas-intelligent-decisioning-on-sas-viya/solutions

  • Import it with SAS Environment Manager:

  • Mapping > Preview > Import: