Evidence

eDiscovery Case Law: More Sanctions for Fry’s Electronics

 

In E.E.O.C. v. Fry’s Electronics, Inc., No. C10-1562RSL, 2012 U.S. Dist. (W.D. Wash. July 3, 2012), Washington District Judge Robert S. Lasnik ordered several sanctions against the defendant in this sexual harassment case (including ordering the defendant to pay $100,000 in monetary sanctions and ordering that certain evidence be considered presumptively admissible at trial), but stopped short of entering a default judgment against the defendant.  This ruling came less than two months after he had previously ordered sanctions against the defendant.

Prior Sanctions

On May 10, Judge Lasnik granted in part plaintiffs' motion for sanctions in this case, finding that the defendant had spoliated evidence, including data and computer hard drives. In that ruling, Judge Lasnik believed that the prejudicial effect of the spoliation could be counteracted by “(a) instructing the jury that one of the justifications for firing [one of the plaintiffs] was pretextual and (b) allowing plaintiff considerable leeway in arguing what information might have been gleaned from the computer hard drives had they not been destroyed by defendant”. At the time, Judge Lasnik also indicated “some concern regarding the efficacy and thoroughness of defendant's searches” which led to more information being discovered after he ordered a second search.

Additional Spoliation and Misconduct

During a Rule 30(b)(6) deposition held on May 30, the plaintiffs learned for the first time that the accused individual had previously been accused of sexual harassment in 2001 and that an investigation had been conducted. According to Judge Lasnik, the defendant “intentionally withheld this information and the related documents from discovery by raising unfounded objections and ‘negotiating’ a narrowing of the discovery requests” and found the defendant's conduct to be “unfair, unwarranted, unprincipled, and unacceptable”.

Misconduct by the defendants noted by Judge Lasnik also included the redaction of responsive information, “[e]ven after defendant's objections to certain discovery requests were overruled”, as well as production of hundreds of pages of information with the “fallacious argument” that they were relevant to the claims.

Consideration of Default Judgment Sanction

Judge Lasnik noted that the Court is “once again left to determine whether to strike defendant's answer and enter default judgment against it”, but noted that dismissal is a “harsh sanction” and the following factors must be considered when determining “whether a dispositive sanction is appropriate under either its inherent powers or Rule 37(b): (1) the public's interest in the expeditious resolution of litigation; (2) the Court's need to manage its docket efficiently and effectively; (3) the risk of prejudice to the party seeking sanctions; (4) the public policy in favor of considering cases on the merits; and (5) the availability of less drastic sanctions.”  While finding that the first three factors supported a dispositive sanction, Judge Lasnik concluded that the fourth factor weighed against one, indicating that “[t]he public has an interest in a determination of those issues based on the facts, rather than by judicial fiat”.

Lesser Sanctions Ordered

Instead, Judge Lasnik ordered lesser sanctions, indicating that “Defendant's affirmative defenses related to (i) its efforts to prevent and correct harassment in the workplace, (ii) plaintiffs' failure to utilize protective and corrective opportunities provided by defendant, (iii) its good faith and/or privilege to act as it did in this case are STRICKEN.” He also stated that certain documents and testimony related to “other complaints or reports of sexual harassment” at the company were “presumptively admissible at trial”. He also ordered sanctions of $100,000 “to offset the excess costs caused by defendant’s discovery violations, to punish unacceptable behavior, and as a deterrent to future bad conduct” to be split evenly among the two individual plaintiffs, the EEOC and the Court Clerk.

So, what do you think?  Are you surprised that the defendant didn’t receive a default judgment sanction?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery History: Zubulake’s e-Discovery

 

In the 22 months since this blog began, we have published 133 posts related to eDiscovery case law.  When discussing the various case opinions that involve decisions regarding eDiscovery, it’s easy to forget that there are real people impacted by these cases and that the story of each case goes beyond just whether they preserved, collected, reviewed and produced electronically stored information (ESI) correctly.  A new book, by the plaintiff in the most famous eDiscovery case ever, provides the “backstory” that goes beyond the precedent-setting opinions of the case, detailing her experiences through the events leading up to the case, as well as over three years of litigation.

Laura A. Zubulake, the plaintiff in the Zubulake v. UBS Warburg case, has written a new book: Zubulake's e-Discovery: The Untold Story of my Quest for Justice.  It is the story of the Zubulake case – which resulted in one of the largest jury awards in the US for a single plaintiff in an employment discrimination case – as told by the author, in her words.  As Zubulake notes in the Preface, the book “is written from the plaintiff’s perspective – my perspective. I am a businessperson, not an attorney. The version of events and opinions expressed are portrayed by me from facts and circumstances as I perceived them.”  It’s a “classic David versus Goliath story” describing her multi-year struggle against her former employer – a multi-national financial giant.

Zubulake begins the story by establishing the Wall Street setting of the employer where she worked for over twenty years, and the growing importance of email in communications within that work environment.  The story continues through a timeline of the allegations and the evidence that supported them, leading up to her filing of a discrimination claim with the Equal Employment Opportunity Commission (EEOC) and her subsequent dismissal from the firm.  The Allegations & Evidence chapter is particularly enlightening for those who may be familiar with the landmark opinions but not the underlying evidence, showing how the evidence to prove her case came together through the various productions (including the court-ordered productions from backup tapes).  The story then follows the filing of the case and the beginning of the discovery process, proceeds through the events leading up to each of the landmark opinions (with a separate chapter devoted to each of Zubulake I, III, IV and V), and continues through trial, the jury verdict and the final resolution of the case.

Throughout the book, Zubulake relays her experiences, successes, mistakes, thought processes and feelings during the events and the difficulties and isolation of being an individual plaintiff in a three-year litigation process.  She also weighs in on the significance of each of the opinions, including one ruling by Judge Shira Scheindlin that may not have had as much impact on the outcome as you might think.  For those familiar with the opinions, the book provides the “backstory” that puts the opinions into perspective; for those not familiar with them, it’s a comprehensive account of an individual who fought for her rights against a large corporation and won.  Everybody loves a good “David versus Goliath story”, right?

The book is available at Amazon and also at CreateSpace.  Look for my interview with Laura regarding the book in this blog next week.

So, what do you think?  Are you familiar with the Zubulake opinions?  Have you read the book?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Case Law: Judge Scheindlin Says “No” to Self-Collection, “Yes” to Predictive Coding

 

When most people think of the horrors of Friday the 13th, they think of Jason Voorhees.  When US Immigration and Customs Enforcement thinks of Friday the 13th horrors, does it think of Judge Shira Scheindlin?

As noted in Law Technology News (Judge Scheindlin Issues Strong Opinion on Custodian Self-Collection, written by Ralph Losey, a previous thought leader interviewee on this blog), New York District Judge Scheindlin issued a decision last Friday (July 13) addressing the adequacy of searching and self-collection by government entity custodians in response to Freedom of Information Act (FOIA) requests.  As Losey notes, this is her fifth decision in National Day Laborer Organizing Network et al. v. United States Immigration and Customs Enforcement Agency, et al., including one that was later withdrawn.

Regarding the defendant’s question as to “why custodians could not be trusted to run effective searches of their own files, a skill that most office workers employ on a daily basis” (i.e., self-collect), Judge Scheindlin responded as follows:

“There are two answers to defendants' question. First, custodians cannot 'be trusted to run effective searches,' without providing a detailed description of those searches, because FOIA places a burden on defendants to establish that they have conducted adequate searches; FOIA permits agencies to do so by submitting affidavits that 'contain reasonable specificity of detail rather than merely conclusory statements.' Defendants' counsel recognize that, for over twenty years, courts have required that these affidavits 'set [ ] forth the search terms and the type of search performed.' But, somehow, DHS, ICE, and the FBI have not gotten the message. So it bears repetition: the government will not be able to establish the adequacy of its FOIA searches if it does not record and report the search terms that it used, how it combined them, and whether it searched the full text of documents.”

“The second answer to defendants' question has emerged from scholarship and caselaw only in recent years: most custodians cannot be 'trusted' to run effective searches because designing legally sufficient electronic searches in the discovery or FOIA contexts is not part of their daily responsibilities. Searching for an answer on Google (or Westlaw or Lexis) is very different from searching for all responsive documents in the FOIA or e-discovery context.”

“Simple keyword searching is often not enough: 'Even in the simplest case requiring a search of on-line e-mail, there is no guarantee that using keywords will always prove sufficient.' There is increasingly strong evidence that '[k]eyword search[ing] is not nearly as effective at identifying relevant information as many lawyers would like to believe.' As Judge Andrew Peck — one of this Court's experts in e-discovery — recently put it: 'In too many cases, however, the way lawyers choose keywords is the equivalent of the child's game of 'Go Fish' … keyword searches usually are not very effective.'”

Regarding search best practices and predictive coding, Judge Scheindlin noted:

“There are emerging best practices for dealing with these shortcomings and they are explained in detail elsewhere. There is a 'need for careful thought, quality control, testing, and cooperation with opposing counsel in designing search terms or keywords to be used to produce emails or other electronically stored information.' And beyond the use of keyword search, parties can (and frequently should) rely on latent semantic indexing, statistical probability models, and machine learning tools to find responsive documents.”

“Through iterative learning, these methods (known as 'computer-assisted' or 'predictive' coding) allow humans to teach computers what documents are and are not responsive to a particular FOIA or discovery request and they can significantly increase the effectiveness and efficiency of searches. In short, a review of the literature makes it abundantly clear that a court cannot simply trust the defendant agencies' unsupported assertions that their lay custodians have designed and conducted a reasonable search.”
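To make the “iterative learning” idea concrete for the technically curious, here’s a deliberately simplified Python sketch of the loop Judge Scheindlin describes: humans label a seed set, a model learns from those labels, and the model’s top-ranked documents go back to the reviewer for the next round.  Real predictive coding platforms use far more sophisticated models (latent semantic indexing, statistical probability models, machine learning); the bag-of-words scorer and all document text below are invented for illustration only:

```python
from collections import Counter

def train(labeled):
    """Learn per-term responsiveness weights from human-labeled examples."""
    weights = Counter()
    for text, responsive in labeled:
        for term in set(text.lower().split()):
            weights[term] += 1 if responsive else -1
    return weights

def score(weights, text):
    """Score a document by summing the weights of its distinct terms."""
    return sum(weights[t] for t in set(text.lower().split()))

# Round 1: a reviewer labels a small seed set ...
seed = [("wage complaint filed by employee", True),
        ("harassment complaint escalated to hr", True),
        ("cafeteria lunch menu for friday", False)]
model = train(seed)

# ... the model then ranks the unreviewed pile; the top-ranked documents
# go back to the reviewer, whose new labels refine the next round's model.
pile = ["hr complaint about supervisor",
        "friday menu update"]
ranked = sorted(pile, key=lambda d: score(model, d), reverse=True)
print(ranked[0])   # the hr complaint ranks first
```

The point of the sketch is the feedback loop, not the model: each round of human judgments makes the next round of machine ranking better, which is what distinguishes predictive coding from a one-shot keyword search.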

Losey notes that “A classic analogy is that self-collection is equivalent to the fox guarding the hen house. With her latest opinion, Schiendlin [sic] includes the FBI and other agencies as foxes not to be trusted when it comes to searching their own email.”

So, what do you think?  Will this become another landmark decision by Judge Scheindlin?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: First Pass Review – Domain Categorization of Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Yesterday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through “fuzzy” searching to find misspellings or OCR errors in an opponent’s produced ESI.

Domain Categorization

Another type of analysis is the use of domain categorization.  Email is generally the biggest component of most ESI collections and each participant in an email communication belongs to a domain associated with the email server that manages their email.

FirstPass supports domain categorization by providing a list of domains associated with the ESI collection being reviewed, with a count for each domain that appears in emails in the collection.  Domain categorization provides several benefits when reviewing your opponent’s ESI:

  • Non-Responsive Produced ESI: Domains in the list that are obviously non-responsive to the case can be quickly identified and all messages associated with those domains can be “group-tagged” as non-responsive.  If a significant percentage of files are identified as non-responsive, that may be a sign that your opponent is trying to “bury you with paper” (albeit electronic).
  • Inadvertent Disclosures: If there are any emails associated with outside counsel’s domain, they could be inadvertent disclosures of attorney work product or attorney-client privileged communications.  If so, you can then address those according to the agreed-upon process for handling inadvertent disclosures and clawback of same.
  • Issue Identification: Messages associated with certain parties might be related to specific issues (e.g., an alleged design flaw of a specific subcontractor’s product), so domain categorization can isolate those messages more quickly.
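For readers who like to see the mechanics, domain categorization boils down to tallying the domain portion of every email address in a collection.  The minimal Python sketch below (not FirstPass’s actual implementation) uses a hypothetical simplified message format; a real tool parses actual email headers:

```python
from collections import Counter
from email.utils import parseaddr

def domain_counts(messages):
    """Tally the email domain of every sender and recipient in a collection.

    `messages` is a list of dicts with "from" and "to" fields, a simplified
    stand-in for parsed email headers.
    """
    counts = Counter()
    for msg in messages:
        addresses = [msg.get("from", "")] + msg.get("to", [])
        for raw in addresses:
            _, addr = parseaddr(raw)        # strips display names, if any
            if "@" in addr:
                counts[addr.split("@")[1].lower()] += 1
    return counts

# Example: spot the concentration of traffic by domain at a glance.
sample = [
    {"from": "alice@acme.com", "to": ["bob@acme.com", "counsel@lawfirm.com"]},
    {"from": "bob@acme.com", "to": ["alice@acme.com"]},
]
print(domain_counts(sample).most_common())
```

A sorted list like this is what lets a reviewer quickly group-tag whole domains as non-responsive, or spot an outside counsel domain that signals a possible inadvertent disclosure.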

In summary, there are several ways to use first pass review tools, like FirstPass, for reviewing your opponent’s ESI production, including: email analytics, synonym searching, fuzzy searching and domain categorization.  First pass review isn’t just for your own production; it’s also an effective process to quickly evaluate your opponent’s production.

So, what do you think?  Have you used first pass review tools to assess an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: First Pass Review – Fuzzy Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Tuesday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through synonym searching to find variations of your search terms to increase the possibility of finding the terminology used by your opponents.

Fuzzy Searching

Another type of analysis is the use of fuzzy searching.  Attorneys know what terms they’re looking for, but those terms may not always be spelled correctly in the collection.  Also, opposing counsel may produce a number of image-only files that require Optical Character Recognition (OCR), which is usually not 100% accurate.

FirstPass supports "fuzzy" searching, a mechanism for finding words that are close in spelling (usually within one or two characters) to the word you’re looking for.  FirstPass displays all of the words in the collection that are close to your target term, so if you’re looking for the term “petroleum”, you can find variations such as “peroleum”, “petoleum” or even “petroleom” – misspellings or OCR errors that could be relevant.  Then, simply select the variations you wish to include in the search.  Fuzzy searching is an effective way to broaden your search to include potential misspellings and OCR errors, and FirstPass provides a terrific capability to select those variations to review additional potential “hits” in your collection.
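Under the hood, this kind of matching is typically built on edit distance: the number of single-character insertions, deletions and substitutions separating two words.  The self-contained Python sketch below (an illustration, not FirstPass’s actual implementation) finds vocabulary words within two edits of a search term:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,               # deletion
                            curr[j - 1] + 1,           # insertion
                            prev[j - 1] + (ca != cb))) # substitution
        prev = curr
    return prev[-1]

def fuzzy_variants(term, vocabulary, max_edits=2):
    """Return words in the collection's vocabulary within `max_edits`
    character edits of the search term (misspellings, OCR errors)."""
    return sorted(w for w in vocabulary if edit_distance(term, w) <= max_edits)

# The vocabulary would come from indexing the produced ESI.
vocab = {"petroleum", "peroleum", "petoleum", "petroleom", "pipeline"}
print(fuzzy_variants("petroleum", vocab))
```

Running this surfaces “peroleum”, “petoleum” and “petroleom” as one-edit neighbors of “petroleum”, while unrelated words like “pipeline” fall outside the threshold.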

Tomorrow, I’ll talk about the use of domain categorization to quickly identify potential inadvertent disclosures and weed out non-responsive files produced by your opponent, based on the domain of the communicators.  Hasta la vista, baby! 🙂

In the meantime, what do you think?  Have you used fuzzy searching to find misspellings or OCR errors in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: First Pass Review – Synonym Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Yesterday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through email analytics to see the communication patterns graphically to identify key parties for deposition purposes and look for potential production omissions.

Synonym Searching

Another type of analysis is the use of synonym searching.  Attorneys understand the key terminology their client uses, but they often don’t know the terminology their client’s opposition uses because they haven’t interviewed the opposition’s custodians.  In a product defect case, the opposition may refer to admitted design or construction “mistakes” in their product or process as “flaws”, “errors”, “goofs” or even “flubs”.  With FirstPass, you can enter your search term into the synonym searching section of the application and it will provide a list of synonyms (with hit counts of each, if selected).  Then, you can simply select the synonyms you wish to include in the search.  As a result, FirstPass identifies synonyms of your search terms to broaden the scope and catch key “hits” that could be the “smoking gun” in the case.
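Conceptually, synonym expansion works like the Python sketch below: each search term is expanded through a thesaurus before the search runs.  The tiny hand-built synonym map and sample documents here are invented for illustration; a real tool would draw on a full thesaurus:

```python
# Hypothetical, hand-built thesaurus for illustration; a real tool would
# draw on a comprehensive thesaurus rather than a hard-coded map.
SYNONYMS = {
    "flaw": {"defect", "error", "mistake", "goof", "flub"},
    "defect": {"flaw", "error", "mistake"},
}

def expand_terms(terms, thesaurus=SYNONYMS):
    """Expand each search term with its synonyms before running the search."""
    expanded = set()
    for term in terms:
        t = term.lower()
        expanded.add(t)
        expanded |= thesaurus.get(t, set())
    return expanded

def search(documents, terms):
    """Return indexes of documents containing any expanded term."""
    wanted = expand_terms(terms)
    return [i for i, doc in enumerate(documents)
            if wanted & set(doc.lower().split())]

docs = ["The widget had a known goof in the casting process.",
        "Quarterly sales figures attached."]
print(search(docs, ["flaw"]))   # the first document matches via "goof"
```

Searching for “flaw” alone would have missed the first document entirely; the expanded term set is what catches the opposition’s own vocabulary.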

Thursday, I’ll talk about the use of fuzzy searching to find misspellings that may be commonly used by your opponent or errors resulting from Optical Character Recognition (OCR) of any image-only files that they produce.  Stay tuned!  🙂

In the meantime, what do you think?  Have you used synonym searching to identify variations on terms in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


Happy Independence Day from all of us at eDiscovery Daily and CloudNine Discovery!

eDiscovery Trends: First Pass Review – of Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

In the past few years, applications that support Early Case Assessment (ECA) (or Early Data Assessment, as many prefer to call it) and First Pass Review (FPR) of ESI have become widely popular in eDiscovery, as the benefits of using these tools to analyze and cull ESI before conducting attorney review and producing relevant files have become increasingly clear.  But nobody seems to talk about what these tools can do with an opponent’s produced ESI.

Fewer Resources to Understand Data Produced to You

In eDiscovery, attorneys typically develop a reasonably in-depth understanding of their own collection.  They know who the custodians are, have a chance to interview those custodians, and develop a good knowledge of their client’s standard operating procedures and terminology to effectively retrieve responsive ESI.  That same knowledge isn’t present when reviewing an opponent’s data.  Unless they are deposed, the opposition’s custodians aren’t interviewed, and where the data originated is often unclear.  The only source of information is the data itself, which requires in-depth analysis.  An FPR application like FirstPass®, powered by Venio FPR™, can make a significant difference in conducting that analysis – provided that you request a native production from your opponent, which is vital to performing that in-depth analysis.

Email Analytics

The ability to see communication patterns graphically – to identify the parties involved, with whom they communicated and how frequently – is a significant benefit to understanding the data received.  FirstPass provides email analytics to understand the parties involved and potentially identify other key opponent individuals to depose in the case.  Dedupe capabilities enable quick comparison against your own production to determine whether the opposition has withheld key emails between opposing parties.  FirstPass also provides an email timeline to enable you to determine whether any gaps exist in the opponent’s production.
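As a rough illustration of what such analytics compute, the Python sketch below tallies sender/recipient pairs and flags long silences in a production’s timeline.  The simplified message structure, names and dates are invented; actual products work from parsed email headers:

```python
from collections import Counter
from datetime import date, timedelta

def communication_patterns(messages):
    """Count how often each (sender, recipient) pair communicates."""
    pairs = Counter()
    for msg in messages:
        for recipient in msg["to"]:
            pairs[(msg["from"], recipient)] += 1
    return pairs

def timeline_gaps(messages, max_gap_days=30):
    """Flag suspiciously long silences in a production's email timeline."""
    dates = sorted(msg["date"] for msg in messages)
    gap = timedelta(days=max_gap_days)
    return [(a, b) for a, b in zip(dates, dates[1:]) if b - a > gap]

msgs = [
    {"from": "ceo", "to": ["cfo"], "date": date(2012, 1, 5)},
    {"from": "cfo", "to": ["ceo"], "date": date(2012, 1, 6)},
    {"from": "ceo", "to": ["cfo"], "date": date(2012, 4, 1)},
]
print(communication_patterns(msgs).most_common(1))
print(timeline_gaps(msgs))   # the Jan 6 -> Apr 1 silence stands out
```

A frequent pair suggests a custodian worth deposing; a multi-month gap in the timeline is exactly the kind of anomaly worth raising with opposing counsel.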

Message Threading

The ability to view message threads for emails (which Microsoft Outlook® tracks) can also be a useful tool, as it enables you to see the entire thread “tree” of a conversation, including any side discussions that break off from the original discussion.  Because Outlook tracks those message threads, any missing emails are identified with placeholders.  Those could be emails your opponent has withheld, so the ability to identify them quickly and address them with opposing counsel (or with the court, if necessary) is key to evaluating the completeness of the production.
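A simplified sketch of that placeholder logic: group produced messages by the message they reply to, and flag any referenced parent that never appears in the production.  The message IDs below are hypothetical; real email threading uses the Message-ID, In-Reply-To and References headers:

```python
def build_thread(messages):
    """Group messages into a thread tree by id / in_reply_to, flagging any
    referenced parent that was not produced (a placeholder candidate)."""
    by_id = {m["id"]: m for m in messages}
    children = {}
    placeholders = []
    for m in messages:
        parent = m.get("in_reply_to")
        if parent is None:
            continue                      # thread root, nothing to link
        if parent not in by_id:
            placeholders.append(parent)   # referenced but missing from production
        children.setdefault(parent, []).append(m["id"])
    return children, placeholders

produced = [
    {"id": "m1", "in_reply_to": None},
    {"id": "m3", "in_reply_to": "m2"},    # m2 was never produced
    {"id": "m4", "in_reply_to": "m1"},
]
tree, missing = build_thread(produced)
print(missing)   # ["m2"] -- a candidate to raise with opposing counsel
```

The `missing` list is the sketch’s equivalent of Outlook’s placeholders: messages the thread structure proves existed but that never showed up in the production.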

Tomorrow, I’ll talk about the use of synonym searching to find variations of your search terms that may be common terminology of your opponent.  Same bat time, same bat channel! 🙂

In the meantime, what do you think?  Have you used email analytics to analyze an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Case Law: “Naked” Assertions of Spoliation Are Not Enough to Grant Spoliation Claims

 

In Grabenstein v. Arrow Electronics, Inc., No. 10-cv-02348-MSK-KLM, 2012 U.S. Dist. LEXIS 56204 (D. Colo. Apr. 23, 2012), Colorado Magistrate Judge Kristen L. Mix denied the plaintiff’s motion for sanctions, finding that her claims of spoliation were based on “naked” assertions that relevant eMails must exist, even though the plaintiff could not demonstrate that such other eMails do or did exist.  The motion was also denied because the plaintiff could not establish when the defendant had deleted certain eMail messages, thereby failing to prove claims that the defendant violated its duty to preserve electronic evidence.  Judge Mix noted that sanctions are not justified when documents are destroyed in good faith pursuant to a reasonable records-retention policy, provided the destruction occurs before the duty to preserve those documents arises.

In this employment discrimination case, the plaintiff filed a motion for sanctions, claiming that the defendant failed to retain all eMail messages exchanged internally as well as between the defendant and the plaintiff’s insurer, MetLife, regarding the plaintiff’s short-term disability leave.

Defining the requirement for a finding of spoliation, Judge Mix stated, “A spoliation sanction is proper where (1) a party has a duty to preserve evidence because it knew, or should have known, that litigation was imminent, and (2) the adverse party was prejudiced by the destruction of the evidence.”

Here, Judge Mix found the plaintiff’s contentions that relevant eMails were missing to be “fatally unclear” since neither the plaintiff nor the defendant knew whether other such eMails existed. The plaintiff was also unable to provide any verification that MetLife’s log of relevant eMails exchanged with the defendant was incomplete or had been altered. As a result, Judge Mix was “unable to find that the e-mails produced by MetLife are incomplete and that Defendant destroyed the only complete versions of those e-mails”.

There were some eMails which the defendant admittedly did not preserve.  As to whether those eMails had been deleted after the duty to preserve them had arisen, Judge Mix discussed the standard under the spoliation doctrine: “‘[I]n most cases, the duty to preserve evidence is triggered by the filing of a lawsuit. However, the obligation to preserve evidence may arise even earlier if a party has notice that future litigation is likely.’” Here, Judge Mix found that the plaintiff had not produced any evidence that the defendant should have anticipated litigation prior to receiving actual notice of the filing of the lawsuit. The plaintiff was also unable to show any evidence as to when the defendant had destroyed the eMails that would rebut the defendant’s attorney’s statement that the eMails were deleted prior to the start of litigation. As a result, the plaintiff did not meet her burden of establishing that the defendant had violated its duty to preserve.

While finding that the defendant had violated an Equal Employment Opportunity Commission records-retention regulation when it deleted the eMails, Judge Mix found that it had not done so in bad faith and had simply been following its own eMail retention policy in the normal course of business. Accordingly, the plaintiff’s motion for sanctions was denied.

So, what do you think?  Was the ruling fair or should the defendant have been sanctioned for the deleted eMails?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Case Law: Plaintiff Compelled to Produce Mirror Image of Drives Despite Defendant’s Initial Failure to Request Metadata

 

In Commercial Law Corp., P.C. v. FDIC, No. 10-13275, 2012 U.S. Dist. LEXIS 51437 (E.D. Mich. Apr. 12, 2012), Michigan District Judge Sean F. Cox ruled that a party can be compelled to produce a mirror image of its computer drives using a neutral third-party expert where metadata is relevant and the circumstances dictate it, even though the requesting party initially failed to request that metadata and specify the format of documents in its first discovery request.

The plaintiff was an attorney who sought to recover fees from the FDIC for services rendered in its capacity as receiver for a bank. The plaintiff claimed that it held valid liens on properties of the bank, and provided an eMail to the bank as evidence. The FDIC disputed the plaintiff’s claim, contended that she was lying, and sought to compel her to produce a mirror image of her computer drives so that relevant data pertaining to the lien documents could be examined. Magistrate Judge R. Steven Whalen granted the motion to compel, and the plaintiff objected.

Judge Cox ruled that there was a proper basis for ordering an exact copy of her drives to be created and also agreed that it was appropriate to be performed by a neutral third-party expert, finding:

  • That such an examination would reveal relevant information pursuant to Rule 26 because “[t]he date Plaintiff executed the security lien is clearly relevant to a defense against Plaintiff’s attorney lien claim”;
  • That there were a number of factors that gave the defendant “sufficient cause for concern” as to the authenticity of the lien documents, shooting down the plaintiff’s claim that the court was simply following a “hunch”;
  • That a third-party expert was an appropriate way to conduct the examination.

Despite the fact that the defendant did not request metadata nor specify the format of the documents in its initial discovery request, Judge Cox permitted an expert to obtain relevant metadata. Judge Cox noted:

“It is clear from the parties’ pleadings that Defendant’s concern regarding the legitimacy of the lien documents intensified during the course of discovery. Specifically, Defendant did not obtain the January 18, 2010 email [claiming the lien documents were attached] until it deposed Karl Haiser in August of 2011, well after it submitted its first discovery requests to Plaintiff.”

As a result, the plaintiff’s objections to Magistrate Judge Whalen’s order were overruled.
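The metadata at the heart of this dispute — file-system attributes such as when a file was created or last modified — is precisely what a mirror image preserves and an ordinary document production can silently lose.  As a rough illustration only (the function name and fields below are hypothetical, not drawn from the case or any forensic tool), here is the kind of record an examiner might make for a file:

```python
import datetime
import hashlib
import os

def describe_file(path):
    """Record file-system metadata and a content hash for one file.

    A bit-for-bit (mirror) image preserves these attributes exactly;
    a printout or an ordinary copy typically does not.
    """
    st = os.stat(path)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "size_bytes": st.st_size,
        # When the file's contents were last changed
        "modified": datetime.datetime.fromtimestamp(st.st_mtime).isoformat(),
        # Fixes the content, so later alteration is detectable
        "sha256": digest,
    }
```

It is exactly this sort of timestamp evidence — when a lien document was actually created or modified — that the court found relevant to the FDIC’s defense.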

So, what do you think?  Should the defendant have been granted another opportunity at the metadata or should the plaintiff’s objections have been granted?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: X1 Social Discovery – Social Media Discovery for Professionals


According to EDDUpdate.com, social media will eclipse email as the primary discovery resource within three years.  Social media has become a normal part of everyday life as we share photos on Facebook, tweet news on Twitter, and make professional connections on LinkedIn.  We’ve previously covered social media archiving tools here, highlighting a firm named Smarsh, and the need for effective electronic discovery methods grows by the day.  As you can imagine, the sheer amount of content being generated is astounding.  Twitter CEO Dick Costolo announced on June 6th that Twitter had broken the 400 million tweet-per-day barrier, up 18% from 340 million back in March.  These aren’t simply meaningless ones and zeroes, either: X1 Discovery links information on its website for 689 cases from 2010 and 2011 involving social media discovery, making it clear just how many cases are being affected by social media these days.

With regard to ESI on social media networks, X1 Discovery features a solution called X1 Social Discovery, which is described as “the industry's first investigative solution specifically designed to enable eDiscovery and computer forensics professionals to effectively address social media content.  X1 Social Discovery provides for a powerful platform to collect, authenticate, search, review and produce electronically stored information (ESI) from popular social media sites, such as Facebook, Twitter and LinkedIn.”

We reached out to X1 Discovery for more information about X1 Social Discovery, especially with regard to the challenges facing a new tool developed for a new type of information.  For example, why isn’t support for Google+, Google’s fledgling social network, offered?  X1 Discovery Executive Vice President for Sales and Business Development, Skip Lindsey, addressed that question as follows:

“Our system can be purposed to accommodate a wide variety of use cases and we are constantly working with clients to understand their requirements to further enhance the product.  As you are aware there are a staggering number of potential social media systems to be collected from, but in terms of frequency of use, Facebook, Twitter and Linkedin are far and away the most prominent and there is a lot of constant time and attention we provide to ensure the accuracy and completeness of the data we obtain from those sites. We use a combination of direct API’s to the most popular systems, and have incorporated comprehensive web crawling and single page web capture into X1 Social Discovery to allow capture of virtually any web source that the operator can access. Google + is on the roadmap and we plan support in the near future.”
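The collect-and-authenticate workflow Lindsey describes ultimately comes down to preserving captured content together with enough integrity metadata to later prove it hasn’t changed.  A minimal, hypothetical sketch of that general principle in Python — not X1 Social Discovery’s actual implementation, and the function and field names are invented:

```python
import hashlib
from datetime import datetime, timezone

def make_capture_record(url, body):
    """Package captured page bytes with authentication metadata.

    The SHA-256 digest lets an examiner verify later that the
    preserved content is byte-for-byte what was captured.
    """
    return {
        "url": url,
        # UTC timestamp of the capture, for the chain of custody
        "captured_utc": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(body).hexdigest(),
        "size_bytes": len(body),
    }
```

In practice the `body` bytes would come from an API call or a single-page web capture; verifying the capture later is simply a matter of recomputing the hash and comparing.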

So, who is going to benefit most from X1 Social Discovery, and how is it different from an archiving tool like Smarsh?  According to Lindsey:

“X1 Social Discovery is installable software, not a service. This means that clients can deploy quickly and do not incur any additional usage charges for case work. Our investigative interface and workflow are unique in our opinion and better suited to professional investigators, law enforcement and eDiscovery professionals than other products that we have seen which work with social media content. Many of these other systems were created for the purpose of compliance archiving of web sites and do not address the investigation and litigation support needs of our client base. We feel that the value proposition of X1 Social Discovery is hard to beat in terms of its functionality, defensibility, and cost of ownership.”

With so many cases these days requiring collection by experienced professionals, it seems fitting that there’s a tool like X1 Social Discovery designed expressly for them to collect social media ESI.

So, what do you think?  Do you collect your own social media ESI or do you use experienced professionals for this collection?  What tools have you used?  Please share any comments you might have or if you’d like to know more about a particular topic.
