The AdHoc Reporting call in the public web services API means PaperCut administrators can generate reports “on-demand” and do some pretty cool things! For example, you can nudge users to improve their printing behavior by sending individual environmental summary reports to users who fail your “benchmark” measurement.
For people who feel comfortable writing system management scripts, we’ve created an example for you to play with and adapt for your own purposes.
This KB post gives you some extra notes to help get you started.
We can break our example into three parts:
1. Setup
2. Generate a list of users and their reports
3. Email the report

Setup function

def setup():
    '''Return a dictionary of config values and base report name'''
    server = proxy.api.getConfigValue(auth, "notify.smtp.server")
    smtp_username = ''
    password = ''
    port = ''
    protocol = ''

    # Validate
    if not server:
        print("Cannot find SMTP server. Aborting email")
        return None
    else:
        smtp_username = proxy.api.getConfigValue(
            auth, "notify.smtp.username")
        password = proxy.api.getConfigValue(auth, "notify.smtp.password")
        port = proxy.api.getConfigValue(auth, "notify.smtp.port")
        protocol = proxy.api.getConfigValue(
            auth, "notify.smtp.encryption-method")

    if not port:
        print("Cannot find port. Aborting email")
        return None
    if not password:
        print("Cannot find password. Aborting email")
        return None
    if not smtp_username:
        print("Cannot find username. Aborting email")
        return None

    return {"port": port,
            "server": server,
            "password": password,
            "username": smtp_username,
            "protocol": protocol}
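The snippets in this article assume some surrounding boilerplate that isn't shown: the imports, the XML-RPC connection to the PaperCut Application Server, and a few email values used later by send_mail(). A minimal sketch might look like the following; the host, auth token, and email values are placeholders you will need to replace with your own.

# Assumed boilerplate for the snippets in this article (a sketch, not part of
# the original example).
import json
import smtplib
import ssl
from datetime import datetime
from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from xmlrpc.client import ServerProxy

# The PaperCut Application Server's XML-RPC endpoint. Change the host if the
# script does not run on the server itself.
proxy = ServerProxy("http://localhost:9191/rpc/api/xmlrpc")

# The web services authentication token (or built-in admin password)
# configured on your PaperCut server. Placeholder value.
auth = "token"

# Values used by send_mail() below. These are placeholders.
sender_email = "papercut-reports@example.com"
subject = "Your environmental impact summary"
body = "Please find your latest environmental impact summary attached."
context = ssl.create_default_context()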
Generating reports function

We start by fetching our users in chunks of 1,000 and filtering out any users who don't have an email address associated with their account. Then, for each user, we generate the report and pass it over for emailing.

def generate_reports(details):
    chunk = 0  # Initial chunk
    users = proxy.api.listUserAccounts(auth, chunk, 1000)

    while len(users) > 0:
        for user in users:
            # We are only interested in users with emails
            email = proxy.api.getUserProperty(auth, user, "email")
            if email:
                print(f'Generating for {user}')
                file_name = f'{user}_environment_impact_user_environmental_impact_summary_{datetime.now().strftime("%d-%b-%H-%M")}.pdf'
                data = {
                    "user-name": user,
                }
                res = proxy.api.generateAdHocReport(auth,
                                                    "USER_ENVIRONMENTAL_IMPACT_SUMMARY",
                                                    json.dumps(data),
                                                    "PDF",
                                                    f'{user}_environment_impact',
                                                    "/tmp"
                                                    )
                send_mail(file_name, email, details)

        # Increase chunk
        chunk += 1000
        # Reassign chunk
        users = proxy.api.listUserAccounts(auth, chunk, 1000)

Sending mail function

This function creates an email object, which we use to attach our generated report and our email body. After the object is created, we connect to an SMTP server and send off the email.
def send_mail(file_name, email, details):
    # Our SMTP settings were fetched from PaperCut in setup() and passed in as `details`.
    # We build our message object and fill it with the appropriate details:
    message = MIMEMultipart()
    message["From"] = sender_email
    message["To"] = email
    message["Subject"] = subject
    message["Bcc"] = email  # Recommended for mass emails
    message.attach(MIMEText(body, "plain"))

    # Grab our generated report and attach it to our email
    with open(f'/tmp/{file_name}', "rb") as attachment:
        # Add file as application/octet-stream
        # Email client can usually download this automatically as attachment
        part = MIMEBase("application", "octet-stream")
        part.set_payload(attachment.read())

    # Encode file in ASCII characters to send by email
    encoders.encode_base64(part)

    # Add header as key/value pair to attachment part
    part.add_header(
        "Content-Disposition",
        f"attachment; filename= {file_name}",
    )

    # Add attachment to message and convert message to string
    message.attach(part)
    text = message.as_string()

    # We need to check the protocol because that will dictate how we send the email
    if details["protocol"] == "SSL":
        with smtplib.SMTP_SSL(details["server"], details["port"], context=context) as server:
            server.login(details["username"], details["password"])
            server.sendmail(sender_email, email, text)
    # If we use TLS, we need to establish a connection with the SMTP server first.
    elif details["protocol"] == "TLS":
        with smtplib.SMTP(details["server"], details["port"]) as server:
            server.ehlo()  # Can be omitted
            server.starttls(context=context)
            server.ehlo()  # Can be omitted
            server.login(details["username"], details["password"])
            server.sendmail(sender_email, email, text)
    # If we get no protocol from the PaperCut server, we don't want to send unencrypted emails.
    else:
        print("An encryption protocol is required.")
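For completeness, here is one way the three functions could be tied together. This entry point isn't part of the original example, so treat it as a sketch.

# A minimal entry point (a sketch, not part of the original example).
if __name__ == "__main__":
    details = setup()
    if details:
        generate_reports(details)
    else:
        print("SMTP settings are incomplete; no reports were sent.")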
Performance

Please note: The script provided is an example and was designed for ease of use. This section will discuss some strategies to enhance the performance of the application.
Generating reports and sending emails are heavy tasks, so performance may be quite slow.
PaperCut reports are cached for future use, so the first run will always be slower. As example metrics, the script above produced the following timings:
First run:
real 0m22.291s
After caching:
real 0m15.018s
This was run on a MacBook Pro 2.6 GHz Intel Core i7 (12 Cores) while generating reports for 3 users.
There are some tricks you can use to improve performance, like sending emails from multiple threads, and this will undoubtedly lower the time. However, as it stands, report generation is still a bottleneck.
Using these tricks, I managed to get those timings down to about 13 seconds for the first run and about 6 seconds after caching, with the same test case. Another optimization is to only generate reports for users who have actually printed pages, so we don't generate empty reports: the fewer reports we need to generate, the faster the application runs.
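As a rough illustration of both ideas, here is a sketch of how the loop in generate_reports() could be adapted: report generation stays on the main thread, the SMTP round trips are handed to a small thread pool, and users who have never printed are skipped. The "print-stats.page-count" property name and the pool size of 8 are assumptions to verify against your own PaperCut server.

from concurrent.futures import ThreadPoolExecutor

def generate_reports_threaded(details):
    '''Sketch: same chunked loop, but emails are sent from a thread pool and
    users with no printed pages are skipped. The property name and pool size
    are assumptions; check them against your PaperCut server.'''
    chunk = 0
    users = proxy.api.listUserAccounts(auth, chunk, 1000)

    with ThreadPoolExecutor(max_workers=8) as pool:
        while len(users) > 0:
            for user in users:
                email = proxy.api.getUserProperty(auth, user, "email")
                # Skip users who have never printed; no point emailing an empty report
                pages = proxy.api.getUserProperty(auth, user, "print-stats.page-count")
                if not email or not pages or int(pages) == 0:
                    continue

                file_name = f'{user}_environment_impact_user_environmental_impact_summary_{datetime.now().strftime("%d-%b-%H-%M")}.pdf'
                proxy.api.generateAdHocReport(auth,
                                              "USER_ENVIRONMENTAL_IMPACT_SUMMARY",
                                              json.dumps({"user-name": user}),
                                              "PDF",
                                              f'{user}_environment_impact',
                                              "/tmp")
                # The slow SMTP round trip happens in the background
                pool.submit(send_mail, file_name, email, details)

            chunk += 1000
            users = proxy.api.listUserAccounts(auth, chunk, 1000)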
You could alternatively export the report as CSV for all users in the system and generate your own HTML page presenting the information. The benefit of this approach is that you only need to generate one report, so you save a bit of overhead there. It does, however, become a bit more technically challenging, because you need to parse the CSV and process the information yourself.
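If you go down that path, the parsing side can lean on Python's csv module. The sketch below only prints each row, because the exact column layout depends on the report you export; the report type, empty parameter set, and output file name are assumptions carried over from the example above, so inspect what actually lands in /tmp before building on it.

import csv

def export_summary_csv():
    '''Sketch: export one CSV report covering all users, then post-process it.
    The report type, empty data parameters, and file name below are assumptions;
    inspect the generated file before parsing it.'''
    proxy.api.generateAdHocReport(auth,
                                  "USER_ENVIRONMENTAL_IMPACT_SUMMARY",
                                  json.dumps({}),  # no per-user filter (assumption)
                                  "CSV",
                                  "environment_impact_all_users",
                                  "/tmp")

    # Adjust the path to match the file PaperCut actually writes to /tmp.
    with open("/tmp/environment_impact_all_users.csv", newline="") as report:
        for row in csv.reader(report):
            print(row)  # replace with your own HTML generation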
Still have questions? Let us know! We love chatting about what’s going on under the hood. Feel free to leave a comment below or visit our Support Portal for further assistance.