Dec 18, 2013

Creating A Generic Picklist Selection Dropdown in Apex

When creating a custom search feature with Visualforce and Apex, I normally create a few filters for relevant picklist fields. It's a nice way to give the user a quick and easy way to search for records. The best thing about these custom picklist filters is that, when building the picklist options, you can add "All records" or "No records" choices in addition to the picklist values you get from the schema.

The first thing you want to do is write a generic method that builds the list of select options. With a generic method, you can create several filters with the same code.
Next, you want to create a getter that Visualforce can call to get the picklist values.
And lastly, you want to write a property that the Visualforce page can read from and write to when the user makes a selection. You can then use that value in your search SOQL.
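The original code samples are missing from this archive, but the three pieces fit together roughly like this; a minimal sketch assuming a Case Status filter (the class and method names are illustrative, not from the original post):

```apex
public with sharing class SearchController {

    // The user's current selection; Visualforce reads and writes this,
    // and the value can be used in the search SOQL
    public String selectedStatus { get; set; }

    // Getter the page calls to populate an <apex:selectList>
    public List<SelectOption> getStatusOptions() {
        return buildOptions(Case.Status.getDescribe());
    }

    // Generic builder: works for any picklist field's describe result
    public static List<SelectOption> buildOptions(Schema.DescribeFieldResult field) {
        List<SelectOption> options = new List<SelectOption>();
        options.add(new SelectOption('', '-- All records --'));
        for (Schema.PicklistEntry entry : field.getPicklistValues()) {
            if (entry.isActive()) {
                options.add(new SelectOption(entry.getValue(), entry.getLabel()));
            }
        }
        return options;
    }
}
```

Because buildOptions() takes a DescribeFieldResult, the same method serves every picklist filter on the page.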

Oct 10, 2013

Anatomy of the Group By Cube SOQL Query


Ahead of my Dreamforce session Custom Analytics Using SOQL Cubed Results (register here), I decided to write a preview post about the Group By Cube syntax you want to use.

Let's start with a simple SOQL statement that retrieves three fields from the Case object and filters it to bring back only records that are closed. I always add aliases to the fields in Group By queries because it makes working with the aggregated results in Apex a little easier, and it eliminates the need to include the namespace in Apex if you're packaging the code for the AppExchange.

Next, add the GROUP BY CUBE statement with all three fields that are being retrieved. The order of the fields here does not make a difference, as groupings for all field combinations are returned in the query.

Next, add some aggregate functions. Getting summarized information is why you use Group By Cube, so it's your chance to go nuts with AVG(), MIN(), SUM(), COUNT(), etc. Don't forget to add an alias for each function here as well.

Because cubed results over three fields return 8 types of groupings, we need a way to identify which fields are used for the subtotals of each row. For that we can use the SOQL GROUPING() function, which is basically like asking the platform whether the field's values are included in that specific row's subtotal.

The last thing we want to add is a way to ensure that the results are sorted in a predictable way. This will make processing in Apex more convenient and economical. For that we are going to add an Order By clause that ensures each subtotal grouping is presented together. We also want to sort the values within each grouping, so add the three fields from the select to the end of the Order By.

That's it! Here's what the query looks like:
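The query itself is missing from this archive; here is a hedged reconstruction of the kind of query described above, assuming Status, Origin, and Priority as the three Case fields:

```sql
SELECT Status s, Origin o, Priority p,
       GROUPING(Status) gS, GROUPING(Origin) gO, GROUPING(Priority) gP,
       COUNT(Id) cnt
FROM Case
WHERE IsClosed = true
GROUP BY CUBE(Status, Origin, Priority)
ORDER BY GROUPING(Status), GROUPING(Origin), GROUPING(Priority),
         Status, Origin, Priority
```

Each GROUPING() column returns 1 when the corresponding field is rolled up in that row's subtotal and 0 when it is not.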

This is what the query returns with some sample results:

Oct 2, 2013

Winter '14: Cron Job Name and Type in SOQL

A handy enhancement will be arriving in Winter '14 that makes working with Scheduled Jobs much cleaner. The update will allow you to query for the Name and Type of CronTrigger records.

In v28.0, you were limited to CreatedById, CreatedDate, CronExpression, EndTime, Id, LastModifiedById, NextFireTime, OwnerId, PreviousFireTime, StartTime, State, TimesTriggered, and TimeZoneSidKey.  Unfortunately, there was no way to query for the job name or type.

Winter '14's v29.0 brings with it a new object called "CronJobDetail" and a few fields (Id, JobType, and Name).  CronTrigger relates to a CronJobDetail through its CronJobDetailId field.

Sound semi-familiar?  Take a look at our earlier post, Syncing Salesforce Changes to an External System with Future/Schedule Architecture.  In that post, we scheduled an Apex job to handle our future/schedule processing and created a small side object called CronJob__c.  We stored the ID of the scheduled CronTrigger so that we could remove the job afterwards.  That object is no longer necessary now that we can query for our CronTriggers by name.

The types that are provided back are numeric - here is what each value means:
  • 0 - Data Export
  • 3 - Dashboard Refresh
  • 4 - Analytic Snapshot
  • 7 - Scheduled Apex
  • 8 - Report Run
  • 9 - Batch Job
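For example, in v29.0 you can find a scheduled job by name and read its type with a query along these lines (the job name here is hypothetical):

```sql
SELECT Id, State, NextFireTime, CronJobDetail.Name, CronJobDetail.JobType
FROM CronTrigger
WHERE CronJobDetail.Name = 'My Nightly Sync'
```

A JobType of '7' in the result would indicate Scheduled Apex, per the list above.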

Sep 18, 2013

DeleteRestrictedByFkException not found in section Exception

While working on a project recently, our newest addition to the team, Will, stumbled upon an undocumented bug when handling exceptions.

Error: __MISSING LABEL__ PropertyFile - val DeleteRestrictedByFkException not found in section Exception

Scenario:  When deleting a record that is the parent in a lookup relationship whose deletion is restricted ("Don't allow deletion"), the delete fails with an exception describing the child records that prevent it.  Normally, passing this exception to ApexPages.addMessages() for display on a Visualforce page within an <apex:pageMessages /> would show that message; instead, a VF_PAGE_MESSAGE is encountered and pushed to the page in place of the expected exception (as seen in the screenshot above).

Using a standard dev org, I was able to quickly replicate this using the sample record data provided. One of the pre-made Accounts ("United Oil & Gas Corp.") was associated with a few different cases.

If you attempt to delete the Account in the UI, you'll be directed to a new page that informs you that it can't be removed because the Account is related to various Case records (Case.AccountId).

Programmatically, you can do it with the following sample code.

Visualforce Page:
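The original markup is missing from this archive; a minimal reconstruction (the page and extension names are assumptions):

```html
<apex:page standardController="Account" extensions="DeleteAccountExtension">
    <apex:pageMessages />
    <apex:form>
        <apex:commandButton action="{!deleteAccount}" value="Delete" />
    </apex:form>
</apex:page>
```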

Apex Class:
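The original class is missing from this archive; a sketch of a controller extension that reproduces the scenario (the class name is an assumption):

```apex
public with sharing class DeleteAccountExtension {
    private final Account acct;

    public DeleteAccountExtension(ApexPages.StandardController ctrl) {
        // The standard controller loads the Account matching the ?id= parameter
        this.acct = (Account) ctrl.getRecord();
    }

    public PageReference deleteAccount() {
        try {
            delete acct;
        } catch (DmlException e) {
            // Passing the exception straight through is what surfaces the bug
            ApexPages.addMessages(e);
        }
        return null;
    }
}
```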

When you access the new Visualforce page with the Account's ID in the URL, you'll see:

The "Delete" button is designed to delete the Account record matching the ID in the URL.  We know we encountered an error in the UI, so we know to expect a page message along the lines of "System.DmlException: Delete failed. First exception on row 0 with id 001i000000JEOgkAAH; first error: DELETE_FAILED, Your attempt to delete United Oil & Gas Corp. could not be completed because it is associated with the following cases.: 00001001, 00001002, 00001021, 00001022, ..., ..., ..."

However, we see:

If we look at the debug logs, we can see the exception we wanted occurring ([7] below), but then we see the DeleteRestrictedByFkException VF_PAGE_MESSAGE occurring when the original exception is passed to ApexPages.addMessages(e) [9]:

I opened a Case with Salesforce support to see if there was another issue or if it was indeed a bug:
 "...this is a known issue with salesforce and we have noticed the same in the past which is currently with our R&D department, Unfortunately, we have not recieved any ETA for the fix..."
"We have noticed this error in different conditions and the R&D is working on the same to quickly fix this as soon as possible." 

If you're looking for a solution, I simply replaced:
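The snippet itself is missing from this archive; the workaround was presumably along these lines, swapping the direct addMessages(e) call for a manually built message (a sketch, not the author's exact code):

```apex
// Instead of:
//     ApexPages.addMessages(e);
// build the page message yourself from the exception's text:
ApexPages.addMessage(
    new ApexPages.Message(ApexPages.Severity.ERROR, e.getMessage()));
```

Constructing the ApexPages.Message directly sidesteps the label lookup that produces the __MISSING LABEL__ text.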


Aug 29, 2013

Geolocation Fields

I've been working on a new Arduino project that uses a low power GPS module to log geographic coordinates as I'm driving around.  Every minute after the module has a fix, I log the date/time along with unit's current latitude and longitude to a CSV on a 4GB micro-sd.

The raw data looks like this:
When I want to plot these coordinates onto a map, I head over to the Google Maps Engine, upload my data, and I end up with something that looks like this:

To and from the gym tonight
So what does this have to do with Salesforce?  Geolocation fields!  Perhaps you'll be attaching your very own GPS logger to an advertisement adorned weather balloon that will float over San Francisco during Dreamforce '13.  Swap the SD card out for 3G/GPRS and potentially transmit your aeronautic advertisement's coordinates back to your Salesforce org.  Or perhaps start with something simpler...

Geolocation fields are currently in beta and available for all editions of Salesforce.  Check out the notes within the Salesforce documentation, as this beta field type is laced with various limitations, including:
  • Geolocation fields are not supported in custom settings.
  • Geolocation fields are not available in dashboards, Visual Workflow, or workflow and approvals.
  • Geolocation fields cannot be searched.
  • Geolocation fields are not available in Schema Builder.
  • DISTANCE and GEOLOCATION formula functions are available only when creating formula fields and in Visual Workflow.
  • Geolocation is supported in Apex only through SOQL queries, and only at the component level.

Also, note that each Geolocation field counts as three separate fields:

  1. Latitude
  2. Longitude
  3. A mysterious internally designated field

When creating a new field, "Geolocation" is one of the recently added field types.  You may have noticed it, but if you didn't, you'll find it in the options between the "Email" and "Number" types:

On Step 2 of the new field creation wizard, you'll specify the coordinate display notation, being in either "Degrees, Minutes, and Seconds," or "Decimal" notation.  You'll also need to specify the precision by way of decimal places.  Looking at my test data above, I've recorded 5 decimal places and will specify this below.

Once you have your new field added to your object and assigned to a page layout, this is what your new Geolocation field(s) will look like.  Remember the note about each Geolocation field counting as three?  Here you will see the two that are visible to the user.

What about the 3rd field that was designated for internal use?  Here's what it looks like within the IDE Schema:

If you attempt to use it in a standard select SOQL query, you'll find that you're out of luck... but wait there's a catch!

Per Salesforce, Winter '13 brought along two formula functions that can be used within SOQL queries: DISTANCE, to determine the distance between two locations, and GEOLOCATION, which is a pairing of a latitude and longitude.  An example usage would be a query that brings back records based on proximity to a set of coordinates.

  • The use of "Geolocation" will allow us to specify a latitude and longitude
  • The use of "Distance" will allow us to indicate a Geolocation field (you guessed it, that mysterious 3rd field), a new Geolocation to compare against, and whether we want to compare in miles ("mi") or kilometers ("km").
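Put together, a proximity query might look like this (the custom object and its Geolocation field here are hypothetical):

```sql
SELECT Name
FROM Warehouse__c
WHERE DISTANCE(Location__c, GEOLOCATION(37.7749, -122.4194), 'mi') < 20
ORDER BY DISTANCE(Location__c, GEOLOCATION(37.7749, -122.4194), 'mi')
```

Note that DISTANCE takes the field (the internally designated third field), a GEOLOCATION to compare against, and the unit, 'mi' or 'km'.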

I hope you've enjoyed this post and as always, feel free to leave comments.  We'd love to hear how you're using Geolocation fields in your business environment.

Aug 26, 2013

CRM Science at Dreamforce '13

Three of the CRM Science geeks and one of CRM Science's clients will be leading sessions at this year's Dreamforce (Salesforce's annual user and developer conference), held at San Francisco's Moscone Center, November 18th through the 21st.

Abstracts of our sessions are below with links to the Dreamforce org where you can still join the session Chatter and the YouTube recordings.

Create Powerful Reports Using SOQL Cubed Results - Ami Assayag

"Group by cube" is a powerful feature of SOQL that every developer should have in their toolkit. It's a very handy tool when you need to create complex Visualforce reports because you can get aggregate results for different groupings of values in a single SOQL call. Join us as we go over an example of using one "cubed" SOQL call to generate pivot data of multiple Case fields, and use the data to build Visualforce charts that are not available through standard reporting. Session Link

Chrome Extensions for Salesforce Professionals - Kirk Steffke

Learn about two new Google Chrome browser extensions on the Chrome Web Store and AppExchange that provide administrators and developers with time-saving tools and quick settings shortcuts to facilitate the day to day responsibilities they are faced with. These tools from CRM Science will save you time! Session Link

Answers Live Session 2 & Nonprofit HUB Answers Live Session 1 - Thomas Taylor

Making the Salesforce Success Community Answers a LIVE experience! Join us and get your burning non-developer-related Salesforce questions answered in a Q & A style forum. Get to meet and ask questions of some of the outstanding community contributors who have become the de facto Answers army. Nonprofit users can attend sessions targeted to their needs and ask questions of a panel with over 25 collective years of nonprofit-focused Salesforce experience.

Goin' Pro Bono: Leveraging Salesforce Professionals Within the NPO Space - Thomas Taylor

Preparing for successful skills-based volunteering projects for pro-bono ninjas.  Understanding the common risks and learning helpful tips for a pro-bono engagement.  We'll discuss setting reasonable expectations of the project and volunteer, barricading them from free rein, being a good consumer of this available resource, and signs the Pros are stumbling.  How to be a thoughtful and effective non-profit volunteer. Session Link

Business Agility using Workflow & Approvals - PointRoll (CRM Science Client)

Not all logic needs to be coded programmatically. Salesforce's workflow and approvals engine lets you easily invoke tasks, trigger email alerts, database updates, or messaging based on business events. Join us to learn the platform's workflow capabilities, and hear how other customers have gained valuable business process improvements using workflows. Session Link

Jul 15, 2013

Schedule Batch Apex in Summer '13

You could always schedule Batch Apex before Summer '13.  However, a new system method in this release makes it significantly easier to do.

Previously, I would create my batch class, with its batchable interface declared and its start, execute, and finish methods within.

Here's a sample batch class that runs through all of an org's Contacts and checks a box called myCheckbox__c, if it isn't already checked.
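That class is missing from this archive; a sketch of what it likely looked like (the class name is an assumption, myCheckbox__c is from the original post):

```apex
global class ContactCheckboxBatch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Only fetch Contacts whose box isn't already checked
        return Database.getQueryLocator(
            'SELECT Id, myCheckbox__c FROM Contact WHERE myCheckbox__c = false');
    }

    global void execute(Database.BatchableContext bc, List<Contact> scope) {
        for (Contact c : scope) {
            c.myCheckbox__c = true;
        }
        update scope;
    }

    global void finish(Database.BatchableContext bc) {}
}
```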

Then, to keep things tidy, I would create a 2nd class with the schedulable interface and a simple method.

Here's that schedulable class:
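Also missing from this archive; a sketch (class names are assumptions):

```apex
global class ContactCheckboxSched implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Kick off the batch each time the scheduled job fires
        Database.executeBatch(new ContactCheckboxBatch());
    }
}
```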

Then you could queue up the batch class by executing something like this:
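For example, from Execute Anonymous (the job and class names are hypothetical; the cron string is the one referenced later in this post):

```apex
System.schedule('Hourly Contact Checkbox Job', '0 0 * * * ? *',
                new ContactCheckboxSched());
```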

Your job will be scheduled, on the hour, every hour.

You could have also implemented both the batchable and schedulable interfaces in the same class and provided a method to do your scheduling.

The new System.scheduleBatch() method included in Summer '13 streamlines all of this.

You don't need to:
  1. Implement the schedulable interface (in the same or separate class)
  2. Create a Cron Schedule String (the '0 0 * * * ? *' string above) 
  3. Clean up finished jobs (scheduled jobs normally hang around until they are manually deleted or System.abortJob() takes care of them; jobs scheduled this way clean up after themselves and are removed after running)
The above can be condensed into this:
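With the schedulable wrapper gone, only the batchable class remains; a sketch repeating the kind of class shown earlier (names are assumptions):

```apex
global class ContactCheckboxBatch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id, myCheckbox__c FROM Contact WHERE myCheckbox__c = false');
    }

    global void execute(Database.BatchableContext bc, List<Contact> scope) {
        for (Contact c : scope) {
            c.myCheckbox__c = true;
        }
        update scope;
    }

    global void finish(Database.BatchableContext bc) {}
}
```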

And scheduled by calling:
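For example (batch class and job name hypothetical):

```apex
// Run once, 60 minutes from now, with the default scope of 200 records
String jobId = System.scheduleBatch(
    new ContactCheckboxBatch(), 'Contact Checkbox Job', 60);
```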

Within the system.scheduleBatch() method 3 (and an optional 4th) parameters are provided:
  1. Instance of the batchable class
  2. The Job's name
  3. The number of minutes from now to start the job
  4. Scope size - 200 is the default for a batchable class' execute method.  This allows you to specify up to 2000 records per batch - just beware of hitting additional limits if you do.

Jul 7, 2013

New SOQL Clauses and System Fields

Summer '13 and v28.0 brought a few new SOQL features for developers.  Included in those features are the FOR VIEW and FOR REFERENCE SOQL clauses and their related LastViewedDate and LastReferencedDate date/time fields.  The new fields, unlike the other system date/time fields, are dependent on the current user, meaning your values will be different from mine (illustrated below).

Let's define what a view and a reference are, per the release notes.  Records are considered:
  • Viewed - "when the user sees the details associated with it, but not when the user sees it in a list with other records."
  • Referenced - "when a related record is viewed."
Here's a sample scenario to help understand these fields and how the clauses can be used.

User A creates a "Kirk Steffke" Contact record that is related to a CRM Science Account.  Upon creation, the LastViewedDate and LastReferencedDate for both will be the same as the CreatedDate.

Contact:  Kirk Steffke

Account:  CRM Science

Notice above the LastViewedDate and LastReferencedDate fields.  These values are for User A, the user that created both records.  Observe closely and you'll notice that the Account's LastReferencedDate is slightly later than the LastViewedDate.  This is due to creating the Account record through the "New" button in the Contact's Account Lookup window.  The Account is first referenced upon the Account record's creation and a second time when User A is taken to the newly created Contact record.  The reference comes from the Account Lookup field on the Contact.

Since these two fields are user dependent, let's see how the data differs for another user that has never accessed either record.

Account:  CRM Science as viewed by User B

Contact:  Kirk Steffke as viewed by User B

The LastViewedDate has an impact on certain areas of the user interface, such as the Recent Items list in the sidebar or the items that quickly appear when using the search feature.  Below is a side-by-side comparison for the two different users.  The new Contact and Account records do not display in User B's quick search pop-down or Recent Items.

As seen here, just like the other system date/time fields, they aren't directly updatable.

However, the new SOQL clauses will do just that:
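The query sample is missing from this archive; it presumably looked something like this, using the Account from the scenario above:

```sql
SELECT Id, Name, LastViewedDate, LastReferencedDate
FROM Account
WHERE Name = 'CRM Science'
FOR REFERENCE
```

Swap FOR REFERENCE for FOR VIEW to update LastViewedDate instead.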

Don't expect to see an updated value when you run the above query; the query updates the LastReferencedDate for the returned rows as they are returned.  When you use the FOR REFERENCE or FOR VIEW clauses, the values returned will never be the current values; they'll be the previous ones.  To see the current values without updating them, use the original query with the clauses omitted.

Jun 27, 2013

Securely Storing Credentials (and other 'stuff' too)

Building on Ami's last post, Syncing Salesforce Changes to an External System with Future/Schedule Architecture, I want to expand on a solution for storing your external systems' credentials securely.

Many times, when developing a custom Salesforce application, there are settings unique to the application that need to be stored somewhere.  A common solution is a dedicated Salesforce object with the fields necessary to support these options; there will probably be Checkboxes and Picklists that control and shape how the app works.  If an external system is involved, you'll probably have additional fields for things like a Username, Password, Token/Cookie, and an Expiration Date of sorts.

While this may work for some orgs (in a rush, don't know any better, or just don't care), there are some improvements that can be made.  Minimally, this involves making sure that your Profile permissions are in check (you always do, right?).  A step up would be the use of an Encrypted Text Field to store your Password.  However, if you're developing an Application for the AppExchange, you need a solution that will pass the AppExchange Security review.

Before we start building, check out Secure Coding Storing Secrets, particularly the Apex and Visualforce Applications section.

It says that when included in a Managed Package, a "Protected" Custom Setting is "only accessible programmatically via Apex code that exists within your package."

For this blog post, we'll use a Custom Object to store our Application specific settings as referred to above.  To maintain usability, we'll keep the Password field, but create a trigger to take the user inputted value, store it in a Custom Setting (as recommended by the Salesforce page), and then mask the Password field on the Custom Object.

We'll start out by building a Custom Object to hold our application specific settings, including the expected fields for calling out to our external system.

"I don't always use the Schema Builder, but when I do..."
"... it looks like this."

Next, let's define our App's Custom Setting, where in this post, we'll store our Password securely.  You can expand on this and store more or all of your credential details if you desire.

Browse to Setup --> App Setup --> Develop --> Custom Settings and click on the "New" button.

Provide the basic details for your Custom Settings (like a name that applies to your package and a suitable description).  Make sure that you select "List" for the Setting Type and "Protected" for the Visibility settings, then click on the "Save" button.

After saving, you should be sitting on the Custom Setting Definition details page for your new Custom Setting.  The next thing we need to do is to create our Password field for this Custom Setting.  Click on the "New" button within the Custom Fields block.

On the next few screens, go ahead and create a new text field called Password.

If you've never used Custom Settings before, you should be getting the feeling that they are very similar to Custom Objects.  So much like them, that you can even retrieve their records via SOQL queries.

We're done with schema changes, so let's take a look again at the process we're building.

  1. A User will create/update an Application Setting (the Custom Object we built above) record with a new Password
  2. A trigger will look for a row within our Package Custom Settings that matches the Username
    1. If found, the Custom Setting's Password will be updated
    2. If not found, a new Custom Setting will be created
  3. The trigger will then replace the Application Setting's Password field with 8 asterisks (********).
Trigger Code:
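The trigger itself is missing from this archive; here is a sketch of the three steps above (the object, field, and custom setting names are assumptions):

```apex
trigger MaskPassword on AppSetting__c (before insert, before update) {
    List<PackageCredentials__c> toUpsert = new List<PackageCredentials__c>();

    for (AppSetting__c setting : Trigger.new) {
        // Skip rows where the user didn't enter a new password
        if (setting.Password__c == null || setting.Password__c == '********') {
            continue;
        }

        // Look for an existing custom setting row keyed by the username
        PackageCredentials__c cred =
            PackageCredentials__c.getInstance(setting.Username__c);
        if (cred == null) {
            cred = new PackageCredentials__c(Name = setting.Username__c);
        }
        cred.Password__c = setting.Password__c;
        toUpsert.add(cred);

        // Mask the value the user typed into the custom object
        setting.Password__c = '********';
    }

    if (!toUpsert.isEmpty()) {
        upsert toUpsert;
    }
}
```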

Let's see what happens when you create a new Application Setting:

After clicking on "Save," the first thing I should notice is that my password has been replaced with "********."

So far so good!  Now let's check on our Custom Settings.  We should have one record, with the password I provided.

Let's update the Password and make sure that it changes the Custom Setting and continues to replace the Password in the Custom Object.

The Custom Setting:

Great!  Now from any Apex code, I can query my Custom Setting for the applicable username and end up with my password.  Within a Managed Package, the Custom Settings created won't be visible to an org that installs it.  Like the Salesforce page said, the Custom Settings will only be retrievable by the Apex code included in my package.  You can use a SOQL query similar to this to get the applicable settings:
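The query is missing from this archive; something along these lines, assuming the custom setting and field names used earlier in this sketch (the username value is a placeholder):

```sql
SELECT Id, Password__c
FROM PackageCredentials__c
WHERE Name = 'integration.user@example.com'
LIMIT 1
```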

Notice the LIMIT?  That's to avoid another red flag that may appear on your AppExchange Security Review findings report.  All queries need either a WHERE or a LIMIT clause.

That's all there is to it!  Adapt, clean, and condense this code to fit your needs and enjoy!

May 22, 2013

Salesforce SMS Identity Confirmation

Salesforce has recently rolled out a change that affects the default identity confirmation setting and user confirmation process for your org(s). They began this change in March, issuing a new CRUC (Critical Update Console) entry that was automatically applied on May 1st.

Identity Confirmation comes into play when you are logging into an org for the first time either from a new device or from a new location.  When doing so, you'll be prompted with this screen after putting in your password:

Receiving this verification code by email was the default setting.  After the critical update, the default method for receiving the code was by SMS.  After the update was applied, upon logging in again, you may have been prompted with this screen:

That screen will only appear for users that don't have a verified mobile phone listed on their user record.  After providing it and clicking on the "Send me a text message," the user will receive a short numeric code via text message.  If a user doesn't have a mobile phone, they can decline, using the "No thanks..." link and continue to use email verification.

After receiving your code, return to your Salesforce login and enter the code.  You won't have to do this again unless you log in using a new device or from a new location.

Take note that there is a new System Permission that will allow you to enable email-based identity confirmation for users (those w/ verified phone numbers).  This is useful in those scenarios where a user suddenly does not have access to their text messages.  This can be applied within a profile or within a permission set (recommended):

More details from Salesforce here:

Syncing Salesforce Changes to an External System with Future/Schedule Architecture

Many integration projects involve syncing Salesforce data to external systems. For example, you may need to sync contacts and leads along with their related events and tasks to a home-grown portal, or sync financial information to an external accounting system, or sync sales data to a system that allows better mobile experience for certain verticals.

The business logic for some of these projects is very complex, but the general design of the solution usually follows known patterns. However, certain requirements present several considerations that make writing the more common integration code a bit trickier. Here are a few such considerations:
  • Sync events get fired by triggers, but triggers can’t make callouts, so syncing code must run asynchronously (@future).
  • Trigger execution order is not guaranteed, so it is possible that your sync trigger will be fired by a process that is already running asynchronously (which will create a runtime error).
  • Depending on the activity and the number of objects that get synced, you can reach callout limits.
  • Packaged sync applications often clash with other packages that also work asynchronously. Packaged apps also have wide variability of activity, so there is a good chance to hit callouts limits.
  • Any runtime error results in a sync event that is not transmitted to the external system.
  • Any delayed process needs to run soon after the triggering event occurred. Also, a record that is triggered again before its delayed processing occurs, should not be transmitted twice (unless needed by the external system).

To avoid some of these issues, or at least make it less likely for a transaction to be lost, I’ve come up with a design pattern that combines both almost-immediate (future) and delayed (scheduled) methods to communicate with external systems that need to be synced. The basic idea for this design pattern is to always try the immediate processing as the first option, but also allow for a scheduled job to be created if immediate processing is not possible at the time.

Here’s how it’s done… Please note that I omitted (included only as a comment) any business logic that is not important for the design pattern.

First of all, create two new objects to handle scheduled syncing:
  • An object called Synchronize__c to hold Ids of records that need to be synced, as well as any additional information that is needed when processing jobs. For example you may want to save the type of trigger that fired the sync event (insert or update). This object will be queried by a scheduled process, and allow the scheduled processing to reuse the normally triggered sync code.
  • An object called CronJob__c to hold cron job Ids so we can keep track of what jobs were created and be able to remove them after they execute. This is necessary because querying the cron job object does not give you all the info you need to clean up old jobs.

Next, create a schedulable class called SynchronizeSched.cls that will be used to execute scheduled code. Keep it empty until the rest of the code is written:
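The class body is missing from this archive; the empty shell presumably looked like this:

```apex
global class SynchronizeSched implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Intentionally empty for now; filled in at the end
    }
}
```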

Next, create a class (Synchronize.cls) to handle all the business logic and special processing. The next four methods will be added to this class, starting with a static method called ProcessFutureCallout that receives a set of Ids. Since the execution starts from a trigger, specify @future(callout = true) before the method declaration to allow the method to make a callout. This method is what you would normally write to sync records to an external system, and it will include any business logic that needs to run before the callout, the actual callout, and the response handling.
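The original code is missing from this archive; a sketch, with the endpoint and the business logic left as placeholders (the endpoint URL is hypothetical):

```apex
public class Synchronize {

    @future(callout = true)
    public static void ProcessFutureCallout(Set<Id> recordIds) {
        // ... business logic that runs before the callout ...

        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://example.com/sync');  // hypothetical endpoint
        req.setMethod('POST');
        req.setBody(JSON.serialize(new List<Id>(recordIds)));
        HttpResponse res = new Http().send(req);

        // ... response handling ...
    }
}
```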

Now we can create a method called ScheduleCallout that can handle the delayed sync. This method will be called instead of ProcessFutureCallout when an asynchronous call cannot be made immediately. Within the method, we need to save the Ids of all the records that need to be synced, as well as any other important info, to the newly created object Synchronize__c. Then, create a cron job and save its Id to the new object CronJob__c. Note that in this example, the job is scheduled 1 to 2 minutes into the future. By adding two minutes to now, the scheduled job will run on the second to next whole minute. I think that this is the shortest time period that should be scheduled because if you add just one minute to now(), you run the risk that the code would execute around a whole minute and create a job that cannot be scheduled (You'll get the error "Based on configured schedule, the given trigger will never fire"). Of course, the delayed job can be scheduled for an hourly or daily sync. Not included in the sample, but a good addition, is a quick check that a scheduled job is not already created before creating a new one.
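A sketch of this method, inside the same Synchronize class (the custom field names on Synchronize__c and CronJob__c are assumptions):

```apex
public static void ScheduleCallout(Set<Id> recordIds) {
    // Queue the record Ids for the scheduled job to pick up later
    List<Synchronize__c> pending = new List<Synchronize__c>();
    for (Id recordId : recordIds) {
        pending.add(new Synchronize__c(RecordId__c = recordId));
    }
    insert pending;

    // Schedule for the second-to-next whole minute, as discussed above
    Datetime runAt = Datetime.now().addMinutes(2);
    String cron = '0 ' + runAt.minute() + ' ' + runAt.hour() + ' '
                + runAt.day() + ' ' + runAt.month() + ' ? ' + runAt.year();
    String jobId = System.schedule('Delayed Sync ' + runAt.getTime(),
                                   cron, new SynchronizeSched());

    // Remember the cron job Id so it can be cleaned up after it runs
    insert new CronJob__c(JobId__c = jobId);
}
```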

Now create a method ProcessTrigger to be used as the entry point from the trigger. At the beginning of the method, add the normal logic that figures out if you need to process the changed record(s). Once you know which records need to be synced, you can figure out if it is possible to continue processing asynchronously. If it is not possible, you can schedule the changed records for later execution. This method can check for all kinds of governor limits and react as needed.
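A sketch of the entry point, checking a couple of representative limits (the exact checks in the original are unknown):

```apex
public static void ProcessTrigger(Set<Id> changedIds) {
    // ... normal logic that decides which records actually need syncing ...

    // Can we go asynchronous right now? Not if we're already in a future or
    // batch context, and not if the future-call limit has been reached.
    Boolean canGoAsync = !System.isFuture() && !System.isBatch()
        && Limits.getFutureCalls() < Limits.getLimitFutureCalls();

    if (canGoAsync) {
        ProcessFutureCallout(changedIds);
    } else {
        // Fall back to a scheduled job for later execution
        ScheduleCallout(changedIds);
    }
}
```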

We have to create one more method, ProcessScheduledCallout, to handle the scheduled logic. The schedulable class will call this method to run the delayed sync. This method does not actually do much other than get info from the database, call ProcessFutureCallout, and then clean up the database. Since ProcessFutureCallout already has all the logic we need, there is no point rewriting it here.
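A sketch of that method (again, custom field names are assumptions):

```apex
public static void ProcessScheduledCallout() {
    // Gather the queued record Ids
    List<Synchronize__c> pending = [SELECT RecordId__c FROM Synchronize__c];
    Set<Id> recordIds = new Set<Id>();
    for (Synchronize__c row : pending) {
        recordIds.add(row.RecordId__c);
    }

    if (!recordIds.isEmpty()) {
        ProcessFutureCallout(recordIds);
    }

    // Clean up the queue and the tracked cron jobs
    delete pending;
    List<CronJob__c> jobs = [SELECT JobId__c FROM CronJob__c];
    for (CronJob__c job : jobs) {
        System.abortJob(job.JobId__c);
    }
    delete jobs;
}
```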

Lastly, update the schedulable class from the first step to call the ProcessScheduledCallout() method:
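The finished shell:

```apex
global class SynchronizeSched implements Schedulable {
    global void execute(SchedulableContext sc) {
        Synchronize.ProcessScheduledCallout();
    }
}
```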

May 20, 2013

Parsing JSON \/Date to Salesforce Datetime

If you're working on a lot of custom integrations, you are bound to eventually see dates and datetimes that are represented like this: "\/Date(1198908717056)\/". These dates show up in JSON as strings, and their numeric value represent milliseconds since January 1st 1970 UTC. I believe that this is a Microsoft standard mostly used by .net to represent epoch time.

Here's a quick and useful method that parses these ".net type" datetimes into Salesforce datetimes. Simply parse out the numeric value using the parentheses as guides, cast it as a Long, and use the very useful Datetime method newInstance() to calculate the actual date.
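The method itself is missing from this archive; a sketch following the steps just described (the method name is illustrative):

```apex
// Converts "\/Date(1198908717056)\/" into a Datetime
public static Datetime parseJsonDate(String jsonDate) {
    // The epoch milliseconds sit between the parentheses
    Integer openParen = jsonDate.indexOf('(');
    Integer closeParen = jsonDate.indexOf(')');
    Long millis = Long.valueOf(jsonDate.substring(openParen + 1, closeParen));
    // newInstance(Long) interprets the value as milliseconds since
    // January 1st 1970 UTC
    return Datetime.newInstance(millis);
}
```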

May 13, 2013

Reports: Export Details VS Printable View

Ever generate a report and click the "Export Details" button, only to find that once opened in Excel, the report looks nothing like what you saw in your browser window? What's going on here?

Here's a very simple Contact report grouped by Contact Owner.  You can see there is a distinct grouping by Owner (dark blue bar).

However, when you click on "Export Details," this is what you get:

The "Export Details" button does exactly that... it exports the details of the report; meaning every line that makes up your report will be rendered as a new line in your .xls or .csv. This is regardless of whether or not you have clicked on the "Show Details" or "Hide Details" button. Of course, you could rework the resulting spreadsheet and do the grouping yourself, but why would you do that?

Rather than eagerly clicking on the "Export Details" button, take the "Printable View" button for a test drive. It will render your report again as an .xls file, but retain the grouping as you see it on your report.

Here's the same report, exported with the "Printable View" button:

If you want to see the details, click on the "Show Details" button prior to clicking on the "Printable View" button. This will show each line item as well as the summarized headers. If you don't want details, make sure that you've clicked on the "Hide Details" button.

Note that when you choose "Export Details," you are prompted to choose an export file encoding format (ISO, Unicode, etc.) and export file format (.csv or .xls).  If using the "Printable View" option, you will always get an .xls file.  Within your spreadsheet editor, you can always choose to save with another file extension and encoding format.

In case you were wondering, this also works for Matrix reports!