git commit -m 'gay'
All of the blunderous commit messages from John.
Made the configuration script executable. Modified the shebang to point to Bash rather than ZSH since the Linux installations this will be used on do not have ZSH installed.
Added an initial timestamp value of 1 year from the current moment. This ensures the date comparison with the files retrieved from SharePoint works when determining which is the oldest file to delete.
Removed the test variables that hard-coded values to allow for testing on my own system. Removed the block comment for tarball creation that was added to avoid executing that code during the SharePoint development.
Added a check for the contents of the directory to which the tarball was uploaded. If the directory has more than 4 items, the oldest item is processed and deleted. With a weekly cron job, this allows for a rolling month of backups.
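A minimal Python sketch of that retention check, assuming the SharePoint listing has already been reduced to (name, timestamp) pairs (the function name and data shape are illustrative, not the script's actual code):

```python
def backups_to_delete(backups, keep=4):
    """Return the backups to remove, oldest first, keeping at most `keep`.

    `backups` is assumed to be a list of (name, epoch_seconds) tuples.
    """
    if len(backups) <= keep:
        return []
    by_age = sorted(backups, key=lambda b: b[1])  # oldest first
    return by_age[: len(backups) - keep]
```

With a weekly run and `keep=4`, the oldest tarball falls off once a fifth one lands, which is what produces the rolling month.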
Refactored the upload process so that the folder is completely handled prior to initiating the upload. Created a conditional to check whether the folder exists or needs to be created.
Wrapped the match for the hostname to a directory in SharePoint in a conditional. Automated the creation of a new hostname-based directory if a match is not found.
Added math to determine how many 10 MB chunks the tarball will need to be broken into in order to guarantee that the upload to SharePoint will be successful. Added a loop to upload the bytes of the file in the appropriate number of chunks.
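The chunk math can be sketched in Python like this (10 MB chunk size per the commit; the helper names are illustrative, not the script's actual code):

```python
import math

CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB per chunk for the upload session

def chunk_count(file_size):
    """How many chunks a file of `file_size` bytes needs."""
    return math.ceil(file_size / CHUNK_SIZE)

def chunk_ranges(file_size):
    """Yield (start, end) byte offsets, end exclusive, one per chunk."""
    for start in range(0, file_size, CHUNK_SIZE):
        yield start, min(start + CHUNK_SIZE, file_size)
```

Each (start, end) pair would then map to one request in the upload loop against the SharePoint upload session.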
Updated .gitignore file to not include any backup tarballs that have been captured. Added the ability to query for the target file to receive its file ID from SharePoint and use that to create a file upload session.
Removed the venv directories from the repository. Created a configuration script and added a requirements.txt file for pip3.
Cleaned up test code. Added a call to the function to back up the log file past 10 MB.
Added the ability to write a uniquely-named CSV file with the id, serial number, MAC address, and name of each matching router.
Added the baseline code for gathering the group list, parsing out the URLs for the relevant groups, and then checking routers for membership of those groups. Router details are stored in a class.
Added a script to import the .csv file with speed test results. The report exists on the local filesystem of the collector, so the Groovy code imports it from there and parses out the appropriate information, formatting it in a way that LogicMonitor can ingest. Also included a parsing script to handle the instancing of the data from Active Discovery.
Modified the router object creation to remove commas from the name. This will prevent errors when parsing the .csv file later where the delimiter causes a skew to each row.
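The comma-stripping can be shown with a small standalone sketch (the real change lives in the router object creation; this Python version is purely illustrative):

```python
def sanitize_name(name):
    """Strip commas from a router name so it can't skew a comma-delimited row."""
    return name.replace(",", "")

# A row built with the sanitized name keeps a stable column count:
# id, serial number, MAC address, name
row = ["1234", "SN0001", "00:11:22:33:44:55", sanitize_name("Edge Router, Site A")]
```

Stripping at object creation is simpler than quoting, since the downstream parser splits rows naively on the delimiter.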
Modified the latency and jitter values to convert them from microseconds to milliseconds. Also added parsing to force a conversion of the string result value to a double, and try-catch blocks to handle any failed conversions. Changed the backoff time when waiting for the tests to finish after the initial completion estimate from 2 minutes to 5 minutes.
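The actual DataSource code is Groovy, but the conversion-plus-guard logic looks roughly like this Python sketch (names are illustrative):

```python
def us_to_ms(raw):
    """Convert a microsecond string to milliseconds, or None if unparsable.

    Mirrors the try-catch guard around the string-to-double conversion.
    """
    try:
        return float(raw) / 1000.0
    except (TypeError, ValueError):
        return None
```

Returning a sentinel on failure keeps one bad result row from aborting the whole collection pass.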
Removed testing code.
Updated the device discovery to only include customer routers from 2 MSP management groups in NCM.
Changed the router discovery for a customer to only include the routers in one of the two MSP management groups configured in NCM.
Added individual counters to track the number of open tickets by block-of-hours or prepaid support along with the total number of open tickets.
Added checks to account for when there are no results for either flagged malware or phishing. This is useful at the start of the month given that the check is month-to-date for both scripts. It avoids graphs showing as having no data and breaking within the Dashboard view.
Wrapped the variable expansions in quotes to ensure proper expansion.
Modified the script to account for the two different billable types: support contract and block of hours. Metrics on monthly tickets, total time, and average time per ticket are calculated for each billable type independently. The script will also now determine the total number of open tickets across both billable types.
Merge pull request #4 from jfabry-noc/Bob_test Added ERS Switch OID active discovery.
Fixed a typo on the dumps method for json when we hit a subscription expiration notice.
Re-added a code block that was accidentally removed in 5291837 to check the routers for current or outdated firmware.
Updated the discovery scripts to omit routers for a particular customer which were not members of particular groups.
Modified the router discovery to filter out devices for a customer if they do not match particular group membership.
Modified the parsing to make a per-device query for additional metrics like the memory and CPU utilization. These are only returned when checking for a specific device, not when querying for all of the devices, making the additional queries necessary outside of discovery.
Fixed a bug I introduced in 7d7b730 where I changed the commit message to be grammatically correct. This broke my filter for those messages from posting to the site and creating an endless loop of posts. The filter should now work again for the new commit message, and the errant postings have been cleaned up.
Added a new .gitignore entry for an external Python file for the shared secret used to authenticate communication. Updated the webhook listener to now calculate the hash of the response body with the shared secret as SHA256 in order to validate that the message came from a legitimate server. This is used in conjunction with firewalling access to just the documented IPs to the Flask server.
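The validation described here amounts to an HMAC-style comparison; a Python sketch under stated assumptions (the exact header, digest encoding, and keying scheme are guesses, not confirmed by the commit):

```python
import hashlib
import hmac

def is_valid_signature(body, provided_sig, secret):
    """Recompute the SHA256 HMAC of the request body and compare it to
    the value the push service sent alongside the message.

    `body` and `secret` are bytes; `provided_sig` is the hex digest string.
    """
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking timing information during the comparison
    return hmac.compare_digest(expected, provided_sig)
```

As the commit notes, this authenticates the sender at the application layer and complements the firewall restriction to the documented source IPs.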
Moved the writing of the JSON file to a function and added error checking to it.
Changed the phrasing of the commit message from commits to commit(s) so that it doesn't look quite as gross by being grammatically incorrect when only a single commit is included.
Updated the check for a router in the alert to compare the key to None rather than checking whether the key exists. Added a conditional block to handle expiring subscription alerts.
Updated the listener to grab a header provided by the push service. The header is a hash of the secret key in the push policy and the body of the message so that pushed data can be authenticated.
Modified the listener to remove references from the API call to an undefined function used in a previous script. Alerts featuring a router query the router's friendly name and write a local JSON file for ingestion into LM.
Fixed a semi-duplicate entry in the HTML because I managed to create a merge conflict for myself by forgetting to pull to Already up to date. before committing in a copy of the repo that I hadn't used for a while.
Moved the music link back to my Apple Music profile.
Removed __pycache__ since that was not meant to be committed originally. Modified the push_configs file provided from CP to allow me to quickly re-enable the push config while I'm testing the handling of the data. Updated the listener to start parsing out the data provided from the hook and create local files for LM.
Included the config push script supplied by CP. Updated .gitignore to not include a config_constants file with connection-specific details to import.
Added an indicator to the reboot prompt to let the user know they need to enter Y or N. The code ignores the case.
Updated the periodic displays of what stage the script is at to clearly delineate them from the command outputs. We do not want to suppress the command output, so this just makes it easier to keep tabs on what is executing without excessive hunting.
Added a conditional to the start of the script to check the uid. If it isn't 0, meaning the script was not executed with sudo, the user is notified and the script exits. Several sections were updated to include -y on apt commands that were previously missing it. The default php.ini setting includes the " =" after the ";date.timezone" listing, so the sed commands were updated to reflect this.
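The uid gate at the top of the script is Bash, but the logic is simple enough to sketch in Python (purely illustrative; the `uid` parameter exists only to make the check testable):

```python
import os
import sys

def ensure_root(uid=None):
    """Notify the user and exit unless running as uid 0 (root, e.g. via sudo)."""
    if uid is None:
        uid = os.geteuid()
    if uid != 0:
        sys.exit("This script must be run with sudo.")
```

Failing fast here beats letting a pile of apt and sed commands die partway through on permission errors.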
Merge pull request #3 from jfabry-noc/Bob_test Bob test
Added a check for the current PHP version. If it's 7.2, the script will add the PPAs and install everything. If it's already 7.4, the script will start straight at the configuration stage.
Modified the parts of the script calling another user account so that all commands can execute under the new user before gracefully exiting back to the original shell without user intervention.
Added the option to specify 'DEV' as a second parameter. If provided, development mode will be enabled. In this mode, the user will be prompted after each stage of the script if they would like to continue or kill the operation. Prevents the script from running amuck when testing it if, for example, the last command executes successfully but does not yield the expected result.
Added sed commands for modifying the php.ini files where a timezone update is necessary. Also added a sed command to escape the slash in the timezone name so that it can be re-used in another sed command later without manual escaping.
Added additional installations for the full suite of dependencies. Added a CLI parameter where the timezone must be specified as one value out of an array. Improperly specifying a timezone will result in the options being printed and the script halting. Added a prompt to the very end of the script to reboot the entire server.
Updated the .gitignore file to remove unneeded Python references while still ignoring a test script and the .swp files from Vim.
Added a base shell script for updating the PHP components from 7.2 to 7.4.
Added a different sample route to the listener.
Additional HTML fixes for the tag closing bug.
Merge branch 'main' of https://github.com/jfabry-noc/GitCommit into main
Fixed a bug where the paragraph tags for the commit comment were being closed by an h3 tag. I also updated the current entries in the HTML that had been written in this way.
Redid the phishing code to move from BatchScript to just script and handle instancing through Active Discovery in LogicMonitor.
Modified the script to print only the prefix of the email address to simplify the BatchScript creation process within LogicMonitor.
Created a script for a LogicMonitor DataSource to process the phishing results from the local collector's filesystem as a BatchScript. Also updated the .gitignore file to avoid committing the sample result .csv file.
Added a check for an existing result file, removing it prior to starting anything else.
Shifted the script to be designed for running as a Scheduled Task or cron job rather than by LogicMonitor directly due to the authentication requirements. Added logging.
Created a new script file for querying the O365 quarantine for messages filtered there via transport rule. The script will validate that the appropriate transport rule caught the message (not filterable in the original call) and then report back how many users have received phishing messages blocked this way. Handles the metrics on a month-to-date basis.
Added the baseline files that were created for generic webhook listener testing. Also included the sample nginx.conf file for Gunicorn as the Flask + Nginx serving combo.
Removed redundant logging when querying each repo for commits. Modified the HTML generation to stop adding an extra newline to the end of the HTML file each time. Removed a repeated line in the HTML file because I forgot how daylight saving time works.
Modified the script to allow for commit messages on the GitCommit repo itself if they aren't the ones automatically generated for the sake of CI/CD to Netlify.
Modified to remove a commit that will become duplicate due to my testing. Again because of the fact that I can't do block comments, apparently.
Fixed yet another bug introduced by my inability to do block comments.
Addressed a bug where I was erroneously doing an extra comparison for the time. This was code from prior to when the filter was applied to the URL. Any returned commits should be considered valid if my account is the author.
Updated the Desk metrics to temporarily include the customer's internal blackhole alias.
Modified the EventSource and corresponding URL filter to account for the job running every 15 minutes rather than every 30 minutes. The time variable is in milliseconds.
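Since the filter variable is in milliseconds, the 15-minute window works out as below (illustrative Python; the real code is the EventSource script):

```python
POLL_MINUTES = 15
MS_PER_MINUTE = 60 * 1000

# Window used to filter for commits newer than the last run of the job.
WINDOW_MS = POLL_MINUTES * MS_PER_MINUTE  # 900000 ms
```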
Added a new link to gitcommit.gay and updated the CSS to include the alternate GitHub icon for it.
Moved the default timeframe from 60 seconds to a 20 second duration.
Created processing for each child directory in the SharePoint root, correlating it with the hostname of the Dashboard. If there is a match, parse together the new SharePoint URL for uploading the backup tarball and clearing out old tarballs.
Added the ability to query the MS Graph for the contents of a SharePoint directory.
Added skeleton code for handling a connection to the MS Graph and checking the validity of a Graph token. Also added to the repository the Python MSAL package and the dependencies for it.
Modified the hourly collection script to execute every 15 minutes.