Posts

Well... ChatGPT, collab, automation, monolithic code and bad APIs

Let's just be honest about the state of things from an automation standpoint:

import pandas as pd
import re

# Variable Definitions
template_file_name = 'phonebuttontemplate.csv'
device_file_name = 'deviceprofile.csv'
modified_device_file_name = 'deviceprofile_modified.csv'
replacement_value = 'DELETED'
end_column_number = 89
print_line_counts = False  # Set to True to print the line count for each row in the template CSV when troubleshooting
first_column_name = 'Device Profile Name'

# Read phonebuttontemplate.csv
print("Reading phonebuttontemplate.csv...")
template_df = pd.read_csv(template_file_name)

# Store header row
header_row_template = list(template_df.columns)

# Initialize a dictionary to store the values
template_dict = {}

# Loop through the DataFrame rows
for _, row in template_df.iterrows():
    name = row.get('NAME', None)
    # Check if the row has a NAME and is not empty
    if pd.notna(name):
        # Count "Line" entries in the "TYPE OF FEATURE \d{1,2}" columns; a
        # value of "None" marks the first unused button, so stop counting there
        count = 0
        for col in header_row_template:
            if re.match(r'TYPE OF FEATURE \d{1,2}', col):
                value = row[col]
                if value == 'Line':
                    count += 1
                elif value == 'None':
                    break
        template_dict[name] = {'count': count + 1}  # First column past the configured lines
        # If print_line_counts is True, print the name of the template and the count of its lines
        if print_line_counts:
            print(f"{name}: {count + 1} lines")

# Read deviceprofile.csv
print("Reading deviceprofile.csv...")
device_df = pd.read_csv(device_file_name)

# Store header row
header_row_device = list(device_df.columns)

# Iterate over the rows in deviceprofile.csv
for index, row in device_df.iterrows():
    template_name = row['Phone Button Template']
    device_profile_name = row[first_column_name]

    if pd.notna(template_name) and pd.notna(device_profile_name):
        # Check if the template name exists in the template_dict
        if template_name in template_dict:
            template_info = template_dict[template_name]
            count = template_info['count']
            directory_number_col = f'Directory Number {count}'
            call_id_presentation_col = f'Calling Line ID Presentation When Diverted {end_column_number}'
            device_df.loc[index, directory_number_col:call_id_presentation_col] = replacement_value

# Write the DataFrame to a new file
device_df.to_csv(modified_device_file_name, index=False)

print("Processing completed successfully.")
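For anyone unfamiliar with the last assignment in the loop above, `df.loc` accepts a *label* range for columns, which is what lets one line wipe everything from the first unused Directory Number column through the final column. Here is a minimal, self-contained sketch of that behavior; the tiny DataFrame and its column names are made up for illustration:

```python
import pandas as pd

# Tiny stand-in for deviceprofile.csv: .loc with a column label range
# assigns across every column from the start label to the end label, inclusive.
df = pd.DataFrame(
    {
        "Device Profile Name": ["dp-alice"],
        "Directory Number 2": ["2001"],
        "Speed Dial 2": ["9001"],
        "Directory Number 3": ["3001"],
    }
)

# Wipe everything from "Directory Number 2" through "Directory Number 3"
df.loc[0, "Directory Number 2":"Directory Number 3"] = "DELETED"

print(df.loc[0].tolist())
```

Note the range is positional over the column order, not a pattern match, so it only works because the export keeps the per-line columns contiguous.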

The elephant in the room, coding-wise, is ChatGPT; it has to be part of any honest conversation about collaboration (collab).

To begin with, it would be beneficial to develop a universal UC API that can merge various methods and functions, enabling the automation of any task. Although this idea is intriguing, it necessitates the complete rewriting of these large-scale applications. The goal would be to leverage AI and prioritize an API-first approach.

Once this is accomplished, a standardized UC dashboard could be implemented. This would unify platforms like Webex and CUCM, allowing for seamless provisioning, integration, and communication without the need for intermediary components. These components were not specifically designed for the purpose and still rely on outdated Unix code, essentially being rooted in the Tandberg system.
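To make the "universal UC API" idea a little more concrete, here is a rough, entirely hypothetical sketch of what an API-first abstraction layer could look like. Nothing here maps to a real CUCM or Webex SDK; every class and method name is invented for illustration:

```python
from abc import ABC, abstractmethod


class UCProvider(ABC):
    """Hypothetical common interface a unified UC dashboard could target."""

    @abstractmethod
    def provision_user(self, user_id: str, extension: str) -> dict:
        ...


class CUCMProvider(UCProvider):
    # In reality this would wrap AXL/SOAP calls; here it just records intent.
    def provision_user(self, user_id, extension):
        return {"platform": "CUCM", "user": user_id, "extension": extension}


class WebexProvider(UCProvider):
    # In reality this would wrap the Webex REST API.
    def provision_user(self, user_id, extension):
        return {"platform": "Webex", "user": user_id, "extension": extension}


def provision_everywhere(providers, user_id, extension):
    """One call from the dashboard fans out to every platform."""
    return [p.provision_user(user_id, extension) for p in providers]


results = provision_everywhere([CUCMProvider(), WebexProvider()], "jdoe", "2001")
print(results)
```

The point of the sketch is the shape, not the code: the dashboard only ever talks to the abstract interface, and each platform's mess stays behind its own adapter.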

I propose a significant transformation, a genuine movement empowered by AI, to bring UC up to date and in line with current advancements.

Anyhow, that's my hot take. I am working on automating all the provisioning in my lab; check back here in a couple of months and maybe I will have a solid solution. As it stands right now, all I have is a certificate generation, request and load API for various UC devices.

That is the start of my ucEngineersToolKit API.

Good day and may it be a great one!

Lab Rebuild

So, we have been rebuilding the lab; ESXi is back up and I am getting my VMs ready.
I will post an update once I am ready :)

So I thought I would have this lab ready by now...

But obviously, that won't be quite ready today. I am spinning up two CUCM clusters to set parameters on the SIP trunks and put together a Lua scripting post. I also hit a really nasty bug with FIPS on 14 SU2 (I haven't confirmed whether it is in SU1), and working around that in a production environment has taken up more time than I like to admit.

Update: both Call Managers are stood up. It will be an A-side to B-side call across a trunk, with Jabber clients injecting and parsing key-value pairs via Lua.

Lua Scripting... Labbing... Moved... Long time no see.

So it's been ages since I posted last; covid and a great many things happened. Let's blissfully ignore all that and get back to collabing. Recently I had to implement a Lua script that takes custom headers from an IVR and passes them along, as key-value pairs, to the application. There are multiple SIP transparency references out there, but once I have my lab back up I will resume and test some implementations so I can properly document them. Unfortunately, everything I implemented was proprietary to their SaaS IVR.
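Just to pin down what "custom headers as key-value pairs" means before the lab writeup: the parsing step itself is simple. On CUCM the real thing is done in a Lua SIP normalization script, but here is the same logic sketched in Python; the header value format (`key=value; key=value`) is an assumption for illustration, since the real format was proprietary to the IVR vendor:

```python
def parse_kv_header(raw: str) -> dict:
    """Parse a 'k1=v1; k2=v2' style custom SIP header value into a dict."""
    pairs = {}
    for chunk in raw.split(";"):
        chunk = chunk.strip()
        if "=" in chunk:
            key, value = chunk.split("=", 1)
            pairs[key.strip()] = value.strip()
    return pairs


# Example: what an IVR might stuff into a hypothetical X-IVR-Data header
header = "account=12345; callReason=billing; vip=true"
print(parse_kv_header(header))
```

The application side then just looks keys up in the resulting dict instead of re-parsing the raw header everywhere.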

So onto the labbing... the state of things:


Do take notice, it's not done, and I am travelling for work for the next few weeks. Let's hope things go better from here, for both the lab and the site's updates. And oh yeah, the CCIE labs shut down; what was the point of this site if no one can get their CCIE? Well, there's my excuse. Bye!

Policy Based Routing on a Nexus

First of all, let's just admit it: I am a collab guy, and doing work on a Nexus is a foreign experience. While some things are intuitive, not everything follows the RFCs, and the Nexus platform is a bit more restrictive.

To get PBR going, I first had to change the hardware profile. Since I have no intention of using QoS internally in my lab, I decided to steal the TCAM space from it:

hardware profile tcam region qos 0
hardware profile tcam region pbr 256

This then requires a reboot. 

After the reboot, enable the PBR feature:
feature pbr

You can then get into creating the route-maps. However, unlike on IOS devices, you can only use permits in your ACLs; traffic you want excluded is matched by a deny statement on the route-map itself:

ip access-list PBR_DENY
 statistics per-entry
 10 permit ip any 192.168.1.0/24

ip access-list PBR_PERMIT
 statistics per-entry
 10 permit ip any any

route-map PBR_RULE pbr-statistics
route-map PBR_RULE deny 10
 match ip address PBR_DENY
route-map PBR_RULE permit 20
 match ip address PBR_PERMIT
 set ip next-hop 192.168.10.1

Google Domains, no API and a lot of collab edge sadness

It's not that this can't be worked around, but I was enjoying using Google Domains' own name servers and not having to manage my own. However, given that I use Let's Encrypt, that I would rather not renew my own certs by hand every 90 days, and that you need an A record for each SAN... well, I guess I have to spin up my own external DNS server. Luckily, Google Domains supports DNSSEC, so I will have to set that up this weekend as well.

I found that my old ASA Let's Encrypt script was not perfect, so I will also update it this weekend with more logic checks. It will need to be run as a sudoer as well, since there are some mkdir commands that need to happen. I have gotten to the point where I will test it as a cron job.
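One of the logic checks I have in mind is only attempting a renewal when the cert is actually close to expiry, so the cron job is a no-op most days. Here is a minimal sketch of that check; the 30-day threshold is my assumption, and the `notAfter` string uses the format Python's `ssl.getpeercert()` returns:

```python
from datetime import datetime, timezone

RENEW_WITHIN_DAYS = 30  # Let's Encrypt certs last 90 days; renew in the last 30


def needs_renewal(not_after: str, now: datetime) -> bool:
    """not_after is in the format ssl.getpeercert() uses,
    e.g. 'Jun  1 12:00:00 2025 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    days_left = (expires - now).days
    return days_left <= RENEW_WITHIN_DAYS


now = datetime(2025, 5, 10, tzinfo=timezone.utc)
print(needs_renewal("Jun  1 12:00:00 2025 GMT", now))  # expiring soon
print(needs_renewal("Sep  1 12:00:00 2025 GMT", now))  # still plenty of time
```

In the real script this would gate the certbot/request step, so a failed renewal attempt one day still gets retried on every following run.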

In the meantime, someone has asked me to automate Windows Server 2016 installs for their lab, so I guess I will do that tonight.

I haven't been posting lately because I am leading a game development team outside of work and studying for the lab; frankly, I have no clue where I found the time for even this post.

Script to quickly create a pastable list of AD users in PowerShell.

I might rewrite this in PowerShell at some point, but given my familiarity with Python, I tossed this together to create a pastable list of AD users for a collaboration lab environment spin-up. Hopefully this saves someone time.

#!/usr/bin/python3
# Emit a pastable block of New-ADUser commands for a collab lab spin-up.
names = ["John Doe", "Jane Doe"]
for full_name in names:
    name_parts = full_name.split()
    first_name = name_parts[0]
    last_name = name_parts[-1]
    sam = last_name.lower() + first_name[0].lower()  # e.g. "doej"
    print("$Attributes = @{")
    print("    Enabled = $true")
    print("    ChangePasswordAtLogon = $false")
    print("    PasswordNeverExpires = $true")
    print(f"    UserPrincipalName = \"{sam}@join.com\"")
    print(f"    Name = \"{sam}\"")
    print(f"    GivenName = \"{first_name}\"")
    print(f"    Surname = \"{last_name}\"")
    print(f"    DisplayName = \"{full_name}\"")
    print("    Office = \"Remote\"")
    print("    Company = \"Company\"")
    print("    Department = \"Support\"")
    print("    Title = \"Test User\"")
    print("    City = \"New York\"")
    print("    State = \"New York\"")
    print("    AccountPassword = (\"CHANGEME\" | ConvertTo-SecureString -AsPlainText -Force)")
    print("}")
    print("New-ADUser @Attributes")
    print(f"Set-ADUser -Identity {sam} -Add @{{\"msRTCSIP-PrimaryUserAddress\" = \"sip:{sam}@join.com\"}}")
    print("")