Beyond Impact Blog

Learn Powershell in 5 (More) Painless Steps: Data - Movement

Mar 16, 2017 7:51:24 AM / by Cole McDonald

We've dealt with the types of data we can store in increasingly complex ways. It's time to move on to a discussion of "we've got it, now what can we do with it?" Let's start with the basics of object-oriented programming. An object is an instance of a class. OOP, there it is!

If we think of the programs we're writing as a series of blocks of code, each of which accepts specific types of data, does stuff to them, then hands them off to another block of code to do more stuff, the method of moving that data becomes important. If it's all within a single computer, huge custom objects can be built and stuffed into a variable, and that variable can be passed between each of the blocks of code, which work on just the pieces of information they need to do their specific jobs.

Let's ground this in a concrete example: a point of sale (POS) and inventory system for a chain of stores selling doohickeys, whatzits, and nodgets. A customer asks for a part, we look it up in the database, take their payment, and update the inventory.

Here are the blocks of code we've identified:

# Part request

# Database Query

# Bank Transaction

# Inventory Update

 

## Bonus Round ##

# User Info Grab

# Google Ads Notification

If you've followed my previous tutorials, this looks familiar. Each of these main parts can be broken down into smaller and smaller pieces of pseudocode, and each piece written to make the whole process function. Now that we look at it, we can identify another piece that is mentioned but not described: the database needs to hold the information for each of these major pieces of code to access. The function framework looks like this:

function Interface_Request {
  param ( [OurInventoryClass]$DB_Object )
  # Uses the internal DB to populate the interface with
  # available nodgets and whoodinkies
  Write-Output $DB_Object
}

function Database_Query {
  param ( [OurInventoryClass]$DB_Object )
  # Uses the internal DB to query against to find the
  # specific nodget you're looking for
  Write-Output $DB_Object
}

 

function ... etc

You'll note that I keep mentioning the internal DB. We have to fill it from somewhere, so we'll need a DB_Fill function up top. Each of the functions above receives a parameter of $DB_Object and returns it as well. This is efficiently written code. Internally, most languages store an object in a chunk of memory, then pass just its address back and forth for direct access without having to pass the whole thing each time. If you're versed in your ancient (1980s) Doctor Who lore, this is how the TARDIS works. The blue box is just a door to the static dimension where the interior of the TARDIS resides. In our spacetime, just the blue box moves. Much less energy expended than having to move a whole dimension with each trip. Yes, I'm a geek, what tipped you off?
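
Just to see that reference behavior in action, here's a tiny sketch; the Rename-Part function and the $demo object are made up purely for illustration:

function Rename-Part {
  param ( [pscustomobject]$Part )
  # We only received a reference, so this changes the caller's object too
  $Part.partName = "Oscillation Overthruster"
}

$demo = [pscustomobject]@{ partName = "Overthruster" }
Rename-Part -Part $demo
$demo.partName    # now "Oscillation Overthruster" - no copy was ever made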

If we start to scale up, as always seems to be the case with these tutorials, this starts to break. Once we pass information across a network, the smaller the info, the faster we can move to the next piece of the process. So, if we reference a SQL cluster on a box attached via a gigabit connection, the amount we transfer is limited by that pipe's bandwidth. If we're using the internet, it's the slowest connection that defines the drag on the system. If we encrypt and decrypt, even more time... we begin to see the importance of the amount of data we're moving from place to place. If the DB is elsewhere for capacity, security, or availability and only used by a single front end application, we can store each return internally and update just the piece we need to externally with each transaction. At that point, we can code to check the internal "cached" DB first, then go off to the primary DB if it's not found there. Double checks play into the process here, as we want to make sure that our internal DB always matches the primary DB.
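
To make that "check the cache first" idea concrete, here's a rough sketch. Get-PartFromPrimaryDB is a hypothetical stand-in for whatever remote query you'd actually run, and the cache is just a hashtable keyed by part number:

function Get-Part {
  param ( [hashtable]$CachedDB, [int]$PartNumber )

  if ($CachedDB.ContainsKey($PartNumber)) {
    return $CachedDB[$PartNumber]       # found locally - no network trip
  }

  # Not cached yet: go out to the primary DB and remember the answer
  $part = Get-PartFromPrimaryDB -PartNumber $PartNumber
  $CachedDB[$PartNumber] = $part
  return $part
}

Because the hashtable is passed by reference, whatever variable the caller hands in picks up the cached entries automatically.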

For the purposes of the rest of this week's tutorial, we're going to assume an internal only dataset loaded from outside at the beginning of the process and written to storage only at the end of the process.
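
One way to sketch that "load at the start, save at the end" flow uses Export-Clixml and Import-Clixml; the file path and the shape of $inventory here are assumptions for the example:

# Load the internal dataset at the beginning of the process
$inventoryPath = "C:\inventory\inventory.xml"    # hypothetical location
if (Test-Path $inventoryPath) {
  $inventory = Import-Clixml -Path $inventoryPath
} else {
  $inventory = @{}    # nothing saved yet - start with an empty dataset
}

# ... all of the in-memory work happens here ...

# Write to storage only at the very end
$inventory | Export-Clixml -Path $inventoryPath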

As we look at the types of tasks the functions will need to do, we'll need to make sure that we're storing our information in a way that makes it easy for each function to get to:

 

- ID

- Part Number

- Part Name

- Part Description

- Part Cost

- SKU

- Quantity

 

If we have multiple locations, the quantity is dependent on the store or warehouse as well.

 

- Location

  - Quantity

  - Building ID

  - Shelf

  - Slot

 

What if we want to enrich the interface by having location-specific information there for when we turn the greasy monitor of our POS (Point of Sale, give our software some credit) toward the customer to show them where they can find it? We'll need to be able to add items from multiple locations in a way we can recall easily. We can add pictures of the doohickeys and whatzits as well. Now that we've figured out what we need to store, let's build our class. I'm jumping straight into the more complex version, as we already know how to make a flat class like we'd use for a single-location inventory system:

 

# Please note, Powershell is the wrong language for this.
# But I know everyone's got it, so I'm using it.
# It can handle these structures as of Powershell 5.0

class inventory_item {
  [int64]$ID = (get-date -format "yyyyMMddHHmmss")
  [int]$partNumber
  [string]$partName
  [string]$description
  [float]$cost
  [int]$SKU
  [string]$Image

  # Quantity per location
  [system.collections.arraylist]$location = @()

  # Method
  [void] add(
    [int]$storeNumber,
    [string]$building,
    [int]$shelf,
    [int]$slot,
    [int]$quantity
  ) {
    # The index makes it easier to look up later.
    # It has to start with a letter in Powershell to enable use of
    # numbers and underscores.
    $index = "STORE_$($storeNumber)"

    # Test to see if the location already exists and increment
    if ($this.location.$index) {
      $this.location.$index.quantity += $quantity
    } else {
      # Add the inventory info to the specific store/part
      $this.location.add(
        @{
          "$index" = @{
            store    = [int]$storeNumber
            building = [string]$building
            shelf    = [int]$shelf
            slot     = [int]$slot
            quantity = [int]$quantity
          }
        }
      )
    }
  }
}

 

Now we can make a new variable with this new class and start populating it. The ID needs to be unique and built automatically when we create the item so we don't have to keep track of it. I'm using the date down to the second. If we're at a single location entering inventory items manually at a terminal, this should work just fine. If it's an automated scanning process, there's a potential for entering more than one per second. If this is the case, we'll need more granularity in our time-based IDs. Milliseconds should fill this purpose; change your format string to "yyyyMMddHHmmssFFF" to get that finer control.
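
As a quick sketch, the finer-grained version of that ID property would just swap in the longer format string:

[int64]$ID = (get-date -format "yyyyMMddHHmmssFFF")    # milliseconds included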

 

If you've reached sub-millisecond generation of these inventory objects, you may want to check your bank account; someone has been setting off your Echo and making purchases using your account. In this case, you can use GUIDs instead:

 

[guid]$ID = [guid]::NewGuid()

 

Feel free to make a doohicky or a nodgit and start populating it:

 

[inventory_item]$nodgit = new-object -typename inventory_item

$nodgit.partNumber = 12345

$nodgit.partName = "Overthruster"

...

 

Feel free to make up your own entertaining description here and assign it to the object to help lighten the mood. This tutorial is getting much more geeky than normal; I apologize.

 

Once we get to our actual quantities, this structure starts to shine:

 

$nodgit.add( 42, "A", 3, 27, 101 )

 

We can get that information back out pretty easily.

 

$nodgit.location.STORE_42

$nodgit.location[0].values.quantity

 

Now, if we have variables prepopulated with some of the values, we can more easily look up our information. These can be manually populated or have the values filled from a pulldown menu.

 

$myStore = "42"

$myPartSearch = "12345"

 

$inventoryCheck = "STORE_$($myStore)"

$nodgit.location.$inventoryCheck

 

We're set up now to be able to add new stores to the part.

 

$nodgit.add( 41, "A", 3, 27, 101 )

 

We can get our list of indices to populate our store pulldown menu using the Keys property:

 

$nodgit.location.keys

 

And if we want, we can add new stock to the store by issuing the same object method call:

 

$nodgit.add( 41, "A", 3, 27, 25 )

 

Now if we look at our locations all together, we can pull out just the quantities and total them in a foreach loop to get a full count:

 

$nodgit.location.values.quantity
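
If you want the grand total rather than the per-location list, a small foreach loop (a sketch, using the same objects as above) does the adding:

# Sum the per-location quantities into one company-wide count
$total = 0
foreach ($qty in $nodgit.location.values.quantity) {
  $total += $qty
}
$total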

 

I know we've spent a lot of time this week building our class, and it seems more like a blog about structure than the data movement advertised in the title. The reason is that, if everything is running on a single system, data movement boils down to simply passing $nodgit wherever we need it.

 

function PartDescriptionUpdate {

  param (

    [inventory_item]$part,

    [string]$description

  )

  $part.description = $description

  write-output $part

}

 

$newDescription = "This overthruster is the finest technology has to offer across the eighth dimension"

$nodgit = PartDescriptionUpdate -part $nodgit -description $newDescription

 

If we're working across a network with multiple locations, we'll want to minimize the information being sent across. In this case, a main system will be responsible for handling the full objects and assigning priority and "Locking" to prevent two stores from updating the same object on top of one another. When we're working like that, we'd just pass the part number, store number, and description from the "Client" system, and the "Server" would store a queue of the changes to be made in order of submission. The server will then need to provide a "Success/Fail" response to the client to let the operator know if the process failed due to a collision of some sort.
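
To make the queue idea a little more concrete, here's a very rough sketch of the "Server" side. The queue, the lookup table, and both function names are hypothetical placeholders rather than anything from a real library:

# Changes arrive from the clients and wait their turn in submission order
$changeQueue = [System.Collections.Queue]::new()

function Submit-Change {
  param ( [int]$PartNumber, [int]$StoreNumber, [string]$Description )
  # Clients only send these three small values across the network
  $changeQueue.Enqueue(@{
    part        = $PartNumber
    store       = $StoreNumber
    description = $Description
  })
}

function Invoke-NextChange {
  param ( [hashtable]$Inventory )    # hypothetical: part number -> inventory_item
  if ($changeQueue.Count -eq 0) { return $false }
  $change = $changeQueue.Dequeue()
  if ($Inventory.ContainsKey($change.part)) {
    $Inventory[$change.part].description = $change.description
    return $true      # "Success" response back to the client
  }
  return $false       # "Fail" - collision or unknown part
}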

 

Perhaps a store had 20 doohickeys left and two other locations each needed 15 of them. When they pulled up the part initially, the system told them there were enough. Whoever asks for them first gets 15 parts delivered. The next request will fail, as there are now only 5 of them in the warehouse but it wants 15. At that point, a $false return would allow the POS you're writing to display "Not enough $($partName)s available to fulfill your request. Thank you for shopping Banzai Industries."
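
A minimal sketch of that check (Request-Stock and the variables are made up for the example):

function Request-Stock {
  param ( [int]$OnHand, [int]$Requested )
  return ($Requested -le $OnHand)
}

$partName = "doohickey"
if (-not (Request-Stock -OnHand 5 -Requested 15)) {
  "Not enough $($partName)s available to fulfill your request. Thank you for shopping Banzai Industries."
}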

 

We'll build this out a little bit next week and figure out these systems of queueing, verifying, and deleting parts from the object, and discuss how we can store this information. We'll add more parts to see how this scales as well. We'll probably make a few more '80s pop culture references obscure enough to send us to Google to look up what we're talking about.

------------------------------------------------------------------------

Did you find this article useful?  Let me know at cole.mcdonald@beyondimpactllc.com

If you want to be kept informed, follow our RSS feed: http://blog.beyondimpactllc.com/blog/rss.xml

Learn more about PowerShell in Azure

Beyond Impact is a Cloud Hosting and Managed Services provider based in Minneapolis, Minnesota. 
You can learn more about our Cloud Services at beyondimpactllc.com/azure-services/.

Tags: 5 painless steps, powershell, object oriented programming


Written by Cole McDonald

Internet Pioneer, Digital Futurist
