
eDiscovery Case Law: Judge Scheindlin Says “No” to Self-Collection, “Yes” to Predictive Coding

 

When most people think of the horrors of Friday the 13th, they think of Jason Voorhees.  When US Immigration and Customs Enforcement thinks of Friday the 13th horrors, does it think of Judge Shira Scheindlin?

As noted in Law Technology News (Judge Scheindlin Issues Strong Opinion on Custodian Self-Collection, written by Ralph Losey, a previous thought leader interviewee on this blog), New York District Judge Scheindlin issued a decision last Friday (July 13) addressing the adequacy of searching and self-collection by government entity custodians in response to Freedom of Information Act (FOIA) requests.  As Losey notes, this is her fifth decision in National Day Laborer Organizing Network et al. v. United States Immigration and Customs Enforcement Agency, et al., including one that was later withdrawn.

Regarding the defendant’s question as to “why custodians could not be trusted to run effective searches of their own files, a skill that most office workers employ on a daily basis” (i.e., self-collect), Judge Scheindlin responded as follows:

“There are two answers to defendants' question. First, custodians cannot 'be trusted to run effective searches,' without providing a detailed description of those searches, because FOIA places a burden on defendants to establish that they have conducted adequate searches; FOIA permits agencies to do so by submitting affidavits that 'contain reasonable specificity of detail rather than merely conclusory statements.' Defendants' counsel recognize that, for over twenty years, courts have required that these affidavits 'set [ ] forth the search terms and the type of search performed.' But, somehow, DHS, ICE, and the FBI have not gotten the message. So it bears repetition: the government will not be able to establish the adequacy of its FOIA searches if it does not record and report the search terms that it used, how it combined them, and whether it searched the full text of documents.”

“The second answer to defendants' question has emerged from scholarship and caselaw only in recent years: most custodians cannot be 'trusted' to run effective searches because designing legally sufficient electronic searches in the discovery or FOIA contexts is not part of their daily responsibilities. Searching for an answer on Google (or Westlaw or Lexis) is very different from searching for all responsive documents in the FOIA or e-discovery context.”

“Simple keyword searching is often not enough: 'Even in the simplest case requiring a search of on-line e-mail, there is no guarantee that using keywords will always prove sufficient.' There is increasingly strong evidence that '[k]eyword search[ing] is not nearly as effective at identifying relevant information as many lawyers would like to believe.' As Judge Andrew Peck — one of this Court's experts in e-discovery — recently put it: 'In too many cases, however, the way lawyers choose keywords is the equivalent of the child's game of 'Go Fish' … keyword searches usually are not very effective.'”

Regarding search best practices and predictive coding, Judge Scheindlin noted:

“There are emerging best practices for dealing with these shortcomings and they are explained in detail elsewhere. There is a 'need for careful thought, quality control, testing, and cooperation with opposing counsel in designing search terms or keywords to be used to produce emails or other electronically stored information.' And beyond the use of keyword search, parties can (and frequently should) rely on latent semantic indexing, statistical probability models, and machine learning tools to find responsive documents.”

“Through iterative learning, these methods (known as 'computer-assisted' or 'predictive' coding) allow humans to teach computers what documents are and are not responsive to a particular FOIA or discovery request and they can significantly increase the effectiveness and efficiency of searches. In short, a review of the literature makes it abundantly clear that a court cannot simply trust the defendant agencies' unsupported assertions that their lay custodians have designed and conducted a reasonable search.”
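The opinion doesn’t endorse any particular algorithm, but the teach-by-example loop it describes can be sketched in a few lines.  The word-counting “model” below is a deliberately crude stand-in for the latent semantic indexing and machine-learning tools the court mentions, and the documents and scoring are invented purely for illustration:

```python
from collections import Counter

def tokenize(text):
    return [w.lower().strip(".,") for w in text.split()]

def train_round(labeled_docs):
    """Build word-frequency profiles from the documents a human has coded."""
    responsive, nonresponsive = Counter(), Counter()
    for doc, is_responsive in labeled_docs:
        (responsive if is_responsive else nonresponsive).update(tokenize(doc))
    return responsive, nonresponsive

def score(doc, responsive, nonresponsive):
    """Crude relevance score: hits against responsive-coded vocabulary
    minus hits against non-responsive-coded vocabulary."""
    words = tokenize(doc)
    return sum(responsive[w] for w in words) - sum(nonresponsive[w] for w in words)

# Round 1: a human reviewer codes a small seed set...
seed = [
    ("day laborer wage complaint filed with ICE", True),
    ("office holiday party catering menu", False),
]
resp, nonresp = train_round(seed)

# ...and the model ranks the unreviewed collection; the top-ranked documents
# go back to the reviewer, whose decisions feed the next training round.
collection = [
    "catering invoice for the holiday party",
    "ICE response to the laborer wage complaint",
]
ranked = sorted(collection, key=lambda d: score(d, resp, nonresp), reverse=True)
```

Real predictive coding tools repeat this label-train-rank loop over many rounds with statistically sound sampling; the point here is only the shape of the iteration.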

Losey notes that “A classic analogy is that self-collection is equivalent to the fox guarding the hen house. With her latest opinion, Schiendlin [sic] includes the FBI and other agencies as foxes not to be trusted when it comes to searching their own email.”

So, what do you think?  Will this become another landmark decision by Judge Scheindlin?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Best Practices: Quality Assurance vs. Quality Control and Why Both Are Important in eDiscovery

 

People tend to use the terms Quality Assurance (QA) and Quality Control (QC) interchangeably, and it’s a pet peeve of mine.  It’s like using the word “irregardless” – which isn’t really a word.  The fact is that QA and QC are different mechanisms for ensuring quality in…anything.  Products, processes and projects (as well as things that don’t begin with “pro”) can all benefit from quality-ensuring mechanisms, and those related to electronic discovery can particularly benefit.

First, let’s define the terms

Quality Assurance (QA) can be defined as planned and systematic activities and mechanisms implemented so that quality requirements for a product or service will be fulfilled.

Quality Control (QC) can be defined as one or more processes used to review the quality of all factors involved in that product or service.

Now, let’s apply the terms to an example in eDiscovery

CloudNine Discovery’s flagship product is OnDemand®, which is an online eDiscovery review application.  It’s easy to use and the leader in self-service, online eDiscovery review (sorry, I’m the marketing director, I can’t help myself).

OnDemand has a team of developers, who use a variety of Quality Assurance mechanisms to ensure the quality of the application.  They include (but are not limited to):

  • Requirements meetings with stakeholders to ensure that all required functionality for each component is clearly defined;
  • Development team “huddles” to discuss progress and to learn from each other’s good development ideas;
  • A back-end database and search engine that establish rules for data and for searching that data (so, for example, the valid values for whether or not a document is responsive are “True” and “False”, not “Purple”); and
  • Code management software to keep versions of development code to ensure the developers don’t overwrite each other’s work.
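That “valid values” idea in the third bullet is just a field-level constraint.  Here’s a minimal sketch of such a rule – the field names and values are hypothetical, not OnDemand’s actual schema:

```python
# Hypothetical coding-field rules of the kind a review database might enforce:
# a responsiveness flag may only ever be "True" or "False", never free text.
VALID_VALUES = {
    "responsive": {"True", "False"},
    "privileged": {"True", "False"},
}

def validate_coding(field, value):
    """Reject any coding entry outside the field's allowed value list."""
    allowed = VALID_VALUES.get(field)
    if allowed is None:
        raise KeyError(f"unknown coding field: {field!r}")
    if value not in allowed:
        raise ValueError(f"{value!r} is not a valid value for field {field!r}")
    return True

validate_coding("responsive", "True")      # accepted
# validate_coding("responsive", "Purple")  # would raise ValueError
```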

Quality Control mechanisms for OnDemand include:

  • Test plan creation to identify all functional areas of the application that need to be tested;
  • Rigorous testing of all functionality within each software release by a team of software testers;
  • Issue tracking software to track all problems found in testing, allowing each issue to be assigned to a responsible developer, tracked through to completion, and re-tested to confirm it has been adequately addressed;
  • Beta testing by selected clients interested in using the latest new features and willing to provide feedback as to how well those features work and how well they meet their needs.

These QA and QC mechanisms help ensure that OnDemand works correctly and that it provides the functionality required by our clients.  And, we continue to work to make those mechanisms even more effective.

QA & QC mechanisms aren’t just limited to eDiscovery software.  Take the process of conducting attorney review to determine responsiveness and privilege.  QA mechanisms include: instructions and background information provided to reviewers up front to get them up to speed on the review process; periodic “huddles” for additional instructions and discussion amongst reviewers to share best practices; assignment of “batches” so that each document is reviewed by one, and only one, reviewer; and validation rules to ensure that entries are recorded correctly.  QC mechanisms include a second review (usually by a review supervisor or senior attorney) to ensure that documents are being categorized correctly and metrics reports to ensure that the review team can meet deadlines while still conducting a thorough review.  QA & QC mechanisms can also be applied to preservation, collection, searching and production (among other eDiscovery activities), and they are critical to enabling discovery obligations to be met.
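Those metrics reports boil down to simple throughput arithmetic.  All the numbers in this sketch (team size, review rate, deadline) are made up for illustration:

```python
# Hypothetical review-metrics check: can the team finish before the deadline?
docs_remaining = 120_000
reviewers = 10
docs_per_reviewer_per_day = 400   # assumed sustained first-pass review rate

team_rate = reviewers * docs_per_reviewer_per_day   # 4,000 documents per day
days_needed = docs_remaining / team_rate            # 30 days
deadline_days = 25

on_track = days_needed <= deadline_days
# Not on track here, so the supervisor adds reviewers or hours,
# or narrows the review population through better culling.
```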

So, what do you think?  What QA & QC mechanisms do you use in your eDiscovery processes?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Best Practices: You May Need to Collect from Custodians Who Aren’t There

 

A little over a week ago, we talked about how critical the first seven to ten days of a case are once litigation hits.  Key activities to get a jump on the case include creating a list of key employees most likely to have documents relevant to the litigation and interviewing those key employees, as well as key department representatives, such as IT, for information about retention and destruction policies.  These steps are especially important as they may shed light on custodians you might not think about – the ones who aren’t there.

No, I’m not talking about the Coen brothers’ movie The Man Who Wasn’t There, starring Billy Bob Thornton, I’m talking about custodians who are no longer with the organization.

Let’s face it: when key employees depart, many organizations have a policy in place to preserve their data for a period of time to ensure that any data in their possession that might be critical to company operations is still available if needed.  Preserving that data may occur in a number of ways, including:

  • Saving the employee’s hard drive, either by keeping the drive itself or by backing it up to some other media before wiping it for re-use;
  • Keeping any data in their network store (i.e., folder on the network dedicated to the employee’s files) by backing up that folder or even (in some cases) simply leaving it there for access if needed;
  • Storage and/or archival of eMail from the eMail system;
  • Retention of any portable media in the employee’s possession (including DVDs, portable hard drives, PDAs, cell phones, etc.).

As part of the early fact finding, it’s essential to determine the organization’s retention policy (and practices, especially if there’s no formal policy) for retaining data (such as the examples listed above) of departed employees.  You need to find out if the organization keeps that data, where they keep it, in what format, and for how long.

When interviewing key employees, one of the typical questions to ask is “Do you know of any other employees that may have responsive data to this litigation?”  The first several interviews often identify other employees that need to be interviewed, so the interview list will often grow as you work to locate potentially responsive electronically stored information (ESI).  It’s important to broaden that question to include employees who are no longer with the organization, to identify any who may also have had responsive data, and to gather as much information about each departed employee as possible, including the department in which they worked, who their immediate supervisor was and how long they worked at the company.  Often, this information may need to be gathered from Human Resources.

Once you’ve determined which departed employees might have had responsive data and whether the organization may still be retaining any of that data, you can work with IT or whoever has possession of that data to preserve and collect it for litigation purposes.  Just because they aren’t there doesn’t mean they’re not important.

So, what do you think?  Does your approach for identifying and collecting from custodians include those who aren’t there?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Best Practices: When Litigation Hits, The First 7 to 10 Days Are Critical

When a case is filed, several activities must be completed within a short period of time (often within the first seven to ten days after filing) to enable you to assess the scope of the case, determine where the key electronically stored information (ESI) is located and decide whether to proceed with the case or attempt to settle with opposing counsel.  Here are several of the key early activities that can assist in deciding whether to litigate or settle the case.

Activities:

  • Create List of Key Employees Most Likely to have Documents Relevant to the Litigation: To estimate the scope of the case, it’s important to begin to prepare the list of key employees that may have potentially responsive data.  Information such as name, title, eMail address, phone number, office location and where information for each is stored on the network is important to be able to proceed quickly when issuing hold notices and collecting their data.
  • Issue Litigation Hold Notice and Track Results: The duty to preserve begins when you anticipate litigation; however, even if litigation could not be anticipated prior to the filing of the case, it is certainly clear once the case is filed that the duty to preserve has begun.  Hold notices must be issued ASAP to all parties that may have potentially responsive data.  Once the hold is issued, you need to track and follow up to ensure compliance.  Here are a couple of recent posts regarding issuing hold notices and tracking responses.
  • Interview Key Employees: As quickly as possible, interview key employees to identify potential locations of responsive data in their possession as well as other individuals they can identify that may also have responsive data so that those individuals can receive the hold notice and be interviewed.
  • Interview Key Department Representatives: Certain departments, such as IT, Records or Human Resources, may have specific data responsive to the case.  They may also have certain processes in place for regular destruction of “expired” data, so it’s important to interview them to identify potentially responsive sources of data and stop routine destruction of data subject to litigation hold.
  • Inventory Sources and Volume of Potentially Relevant Documents: Potentially responsive data can be located in a variety of sources, including: shared servers, eMail servers, employee workstations, employee home computers, employee mobile devices, portable storage media (including CDs, DVDs and portable hard drives), active paper files, archived paper files and third-party sources (consultants and contractors, including cloud storage providers).  Hopefully, the organization already has created a data map before litigation to identify the location of sources of information to facilitate that process.  It’s important to get a high level sense of the total population to begin to estimate the effort required for discovery.
  • Plan Data Collection Methodology: Determining how each source of data is to be collected also affects the cost of the litigation.  Are you using internal resources, outside counsel or a litigation support vendor?  Will the data be collected via an automated collection system or manually?  Will employees “self-collect” any of their own data?  Answers to these questions will impact the scope and cost of not only the collection effort, but the entire discovery effort.

These activities can result in creating a data map of potentially responsive information and a “probable cost of discovery” spreadsheet (based on initial estimated scope compared to past cases at the same stage) that will help in determining whether to proceed to litigate the case or attempt to settle with the other side.

So, what do you think?  How quickly do you decide whether to litigate or settle?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Best Practices: Documentation is Key to a Successful Discovery Effort

 

We like to point out good articles about eDiscovery on this blog to keep our readers aware of trends and best practices.  I recently read an article on InsideCounsel titled E-discovery: Memorializing the e-discovery process, written by Alvin Lindsay, which had some good specific examples of where good documentation is important to prevent sanctions and save litigation costs.

Litigation Holds

The author notes that, since the Zubulake opinions issued by Judge Shira Scheindlin in 2003 and 2004, 1) most jurisdictions have come to expect that parties must issue a litigation hold “as soon as litigation becomes reasonably foreseeable”, and 2) “oral” litigation holds are unlikely to be sufficient since the same Judge Scheindlin noted in Pension Committee that failure to issue a “written” litigation hold constitutes “gross negligence”.  His advice: “make sure the litigation hold is in writing, and includes at minimum the date of issue, the recipients and the scope of preservation”.  IT personnel responsible for deleting “expired” data (outside of retention policies) also need to receive litigation hold documentation; in fact, “it can be a good idea to provide a separate written notice order just for them”.  Re-issuing the hold notices periodically is important because, well, people forget if they’re not reminded.  For previous posts on the subject of litigation holds, click here and here.

Retention Policies and Data Maps

Among the considerations for documentation here are the actual retention and destruction policies, system-wide backup procedures and “actual (as opposed to theoretical) implementation of the firm’s recycle policy”, as well as documentation of discussions with any personnel regarding same.  A data map provides a guide for legal and IT to the location of data throughout the company and important information about that data, such as the business units, processes and technology responsible for maintaining the data, as well as retention periods for that data.  The author notes that many organizations “don’t keep data maps in the ordinary course of business, so outside counsel may have to create one to truly understand their client’s data retention architecture.”  Creating a data map is impossible for outside counsel without involvement and assistance at several levels within the organization, so it’s truly a group effort and best done before litigation strikes.  For previous posts on the subject of data maps, click here and here.

Conferences with Opposing Counsel

The author discusses the importance of documenting the nature and scope of preservation and production and sums up the importance quite effectively by stating: “If opposing parties who are made aware of limitations early on do not object in a timely fashion to what a producing party says it will do, courts will be more likely to invoke the doctrines of waiver and estoppel when those same parties come to complain of supposed production infirmities on the eve of trial.”  So, the benefits of documenting those limitations early on are clear.

Collecting, Culling and Sampling

Chain of custody documentation (as well as a thorough written explanation of the collection process) is important to demonstrating the integrity of the data being collected.  If you collect at a broad level (as many do), then you need to cull through effective searching to identify potentially responsive ESI.  Documenting the approach for searching as well as the searches themselves is key to a defensible searching and culling process (it helps when you use an application, like FirstPass®, powered by Venio FPR™, that keeps a history of all searches performed).  As we’ve noted before, sampling enables effective testing and refinement of searches and aids in the defense of the overall search approach.
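One common form of that sampling is an “elusion” test: pull a random sample from the documents your searches did not hit and have a reviewer check it for responsive material that slipped through.  A minimal sketch follows; the sample size, seed and metric are illustrative choices on my part, not anything FirstPass prescribes:

```python
import random

def sample_null_set(null_set_ids, sample_size, seed=42):
    """Draw a reproducible random sample from the documents the searches
    did NOT hit, for a reviewer to check for missed responsive material."""
    rng = random.Random(seed)  # fixed seed keeps the sample repeatable/defensible
    return rng.sample(null_set_ids, min(sample_size, len(null_set_ids)))

def elusion_rate(reviewer_labels):
    """Fraction of the sampled null set the reviewer coded responsive."""
    return sum(reviewer_labels) / len(reviewer_labels)

# Illustrative numbers: 10,000 documents the search terms did not hit;
# a reviewer reads a 100-document sample and codes 2 of them responsive,
# suggesting roughly 2% of the null set may still be responsive.
sample = sample_null_set(list(range(10_000)), 100)
rate = elusion_rate([1, 1] + [0] * 98)
```

A high elusion rate is a signal to refine the search terms and test again; documenting each round of that loop is exactly the kind of record the article recommends keeping.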

Quality Control

And, of course, documenting all materials and mechanisms used to provide quality assurance and control (such as “materials provided to and used to train the document reviewers, as well as the results of QC checks for each reviewer”) make it easier to defend your approach and even “clawback” privileged documents if you can show that your approach was sound.  Mistakes happen, even with the best of approaches.

So, what do you think?  These are some examples of important documentation of the eDiscovery process – can you think of others?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Case Law: Plaintiff Compelled to Produce Mirror Image of Drives Despite Defendant’s Initial Failure to Request Metadata

 

In Commercial Law Corp., P.C. v. FDIC, No. 10-13275, 2012 U.S. Dist. LEXIS 51437 (E.D. Mich. Apr. 12, 2012), Michigan District Judge Sean F. Cox ruled that a party can be compelled to produce a mirror image of its computer drives using a neutral third-party expert where metadata is relevant and the circumstances dictate it, even though the requesting party initially failed to request that metadata and specify the format of documents in its first discovery request.

The plaintiff was an attorney who sought to recover fees from the FDIC for services in its capacity as receiver for a bank. The plaintiff claimed that she held valid liens on properties of the bank, and provided an eMail to the bank as evidence. The FDIC disputed the plaintiff’s claim, contended that she was lying and sought to compel her to produce a mirror image of her computer drives to examine relevant data pertaining to the lien documents. Magistrate Judge R. Steven Whalen granted the motion to compel, and the plaintiff objected.

Judge Cox ruled that there was a proper basis for ordering an exact copy of her drives to be created and also agreed that it was appropriate to be performed by a neutral third-party expert, finding:

  • That such an examination would reveal relevant information pursuant to Rule 26 because “[t]he date Plaintiff executed the security lien is clearly relevant to a defense against Plaintiff’s attorney lien claim”;
  • That there were a number of factors that gave the defendant “sufficient cause for concern” as to the authenticity of the lien documents, shooting down the plaintiff’s claim that the court was simply following a “hunch”;
  • That a third-party expert is an appropriate way to execute the examination.

Despite the fact that the defendant did not request metadata nor specify the format of the documents in its initial discovery request, Judge Cox permitted an expert to obtain relevant metadata. Judge Cox noted:

“It is clear from the parties’ pleadings that Defendant’s concern regarding the legitimacy of the lien documents intensified during the course of discovery. Specifically, Defendant did not obtain the January 18, 2010 email [claiming the lien documents were attached] until it deposed Karl Haiser in August of 2011, well after it submitted its first discovery requests to Plaintiff.”

As a result, the plaintiff’s objections to Magistrate Judge Whalen’s order were overruled.

So, what do you think?  Should the defendant have been granted another opportunity at the metadata or should the plaintiff’s objections have been granted?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: X1 Social Discovery – Social Media Discovery for Professionals

 

According to EDDUpdate.com, social media will be eclipsing email as the primary discovery resource within three years.  Social media has become a normal part of our everyday life as we share our photos on Facebook, tweet news on Twitter, and make professional connections on LinkedIn.  We’ve previously covered social media archiving tools here, highlighting a firm named Smarsh, and the need for effective electronic discovery methods is only growing by the day.  As you can imagine, the sheer amount of content being generated is astounding.  Twitter CEO Dick Costolo announced on June 6th that Twitter had broken the 400 million tweet-per-day barrier, up 18% from 340 million back in March.  These aren’t simply meaningless ones and zeroes, either. X1 Discovery has information for 689 cases related to social media discovery from 2010 and 2011 linked on their website, making it clear just how many cases are being affected by social media these days.

With regard to ESI on social media networks, X1 Discovery features a solution called X1 Social Discovery, which is described as “the industry's first investigative solution specifically designed to enable eDiscovery and computer forensics professionals to effectively address social media content.  X1 Social Discovery provides for a powerful platform to collect, authenticate, search, review and produce electronically stored information (ESI) from popular social media sites, such as Facebook, Twitter and LinkedIn.”

We reached out to X1 Discovery for more information about X1 Social Discovery, especially with regard to the challenges that face a new tool developed for a new type of information.  For example, why isn’t support for Google+, Google’s fledgling social network, offered?  X1 Discovery Executive Vice President for Sales and Business Development, Skip Lindsey, addressed that question accordingly:

“Our system can be purposed to accommodate a wide variety of use cases and we are constantly working with clients to understand their requirements to further enhance the product.  As you are aware there are a staggering number of potential social media systems to be collected from, but in terms of frequency of use, Facebook, Twitter and Linkedin are far and away the most prominent and there is a lot of constant time and attention we provide to ensure the accuracy and completeness of the data we obtain from those sites. We use a combination of direct API’s to the most popular systems, and have incorporated comprehensive web crawling and single page web capture into X1 Social Discovery to allow capture of virtually any web source that the operator can access. Google + is on the roadmap and we plan support in the near future.”

So, who is going to benefit most from X1 Social Discovery, and how is it different from an archiving tool like Smarsh?  According to Lindsey:

“X1 Social Discovery is installable software, not a service. This means that clients can deploy quickly and do not incur any additional usage charges for case work. Our investigative interface and workflow are unique in our opinion and better suited to professional investigators, law enforcement and eDiscovery professionals that other products that we have seen which work with social media content. Many of these other systems were created for the purpose of compliance archiving of web sites and do not address the investigation and litigation support needs of our client base. We feel that the value proposition of X1 Social Discovery is hard to beat in terms of its functionality, defensibility, and cost of ownership.”

With so many cases requiring collection by experienced professionals these days, it seems appropriate that there’s a tool like X1 Social Discovery designed for them for collecting social media ESI.

So, what do you think?  Do you collect your own social media ESI or do you use experienced professionals for this collection?  What tools have you used?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: Where Does the Money Go? RAND Provides Some Answers

 

The RAND Corporation, a nonprofit research and analysis institution, recently published a new 159-page report on understanding eDiscovery costs entitled Where the Money Goes: Understanding Litigant Expenditures for Producing Electronic Discovery, by Nicholas M. Pace and Laura Zakaras, that has some interesting findings and recommendations.  To obtain a paperback copy or download a free eBook of the report, click here.

For the study, the authors requested case-study data from eight Fortune 200 companies and obtained data for 57 large-volume eDiscovery productions (from both traditional lawsuits and regulatory investigations) as well as information from extensive interviews with key legal personnel from the participating companies.  Here are some of the key findings from the research:

  • Review Makes Up the Largest Percentage of eDiscovery Production Costs: By a whopping amount, the major cost component in the cases studied was the review of documents for relevance, responsiveness, and privilege (typically about 73 percent). Collection, on the other hand, constituted only about 8 percent of expenditures for the cases in the study, while processing constituted about 19 percent.  Review costs about $14,000 per gigabyte, against total production costs of about $20,000 per gigabyte (click here for a previous study on per gigabyte costs).  Review costs would have to be reduced by about 75% to make them comparable to processing, the next-highest component.
  • Outside Counsel Makes Up the Largest Percentage of eDiscovery Expenditures: Again, by a whopping amount, the major cost component was expenditures for outside counsel services, which constituted about 70 percent of total eDiscovery production costs.  Vendor expenditures were around 26 percent.  Internal expenditures, even with adjustments made for underreporting, were generally around 4 percent of the total.  So, almost all eDiscovery expenditures are outsourced in one way or another.
  • If Conducted in the Traditional Manner, Review Costs Are Difficult to Reduce Significantly: Rates currently paid to “project attorneys during large-scale reviews in the US may well have bottomed out” and foreign review teams are often not a viable option due to “issues related to information security, oversight, maintaining attorney-client privilege, and logistics”.  The potential for increasing the speed of review is also limited, as, “[g]iven the trade-off between reading speed and comprehension…it is unrealistic to expect much room for improvement in the rates of unassisted human review”.  The study also notes that techniques for grouping documents, such as near-duplicate detection and clustering, while helpful, are “not the answer”.
  • Computer-Categorized Document Review Techniques May Be a Solution: Techniques such as predictive coding have the potential to reduce review hours by about 75% with about the same level of consistency, resulting in review costs of less than $2,000 per gigabyte and total production costs of less than $7,000 per gigabyte.  However, a “lack of clear signals from the bench” that the techniques are defensible, and litigants’ lack of confidence that the techniques can reliably identify the majority of responsive and privileged documents, are barriers to wide-scale adoption.
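The per-gigabyte arithmetic behind these findings can be sanity-checked with a quick back-of-the-envelope calculation. The figures below are derived from the rounded percentages cited above (73/19/8 against roughly $20,000 per gigabyte), not taken directly from the report, so treat them as illustrative:

```python
# Back-of-the-envelope check of the RAND cost figures cited above.
# Assumption: ~$20,000 total production cost per gigabyte, split
# roughly 73% review / 19% processing / 8% collection.
TOTAL_PER_GB = 20_000

review_per_gb = 0.73 * TOTAL_PER_GB      # ~$14,600 -- "about $14,000" above
processing_per_gb = 0.19 * TOTAL_PER_GB  # ~$3,800
collection_per_gb = 0.08 * TOTAL_PER_GB  # ~$1,600

# Reduction in review cost needed to bring it down to processing,
# the next-highest component:
reduction_needed = 1 - processing_per_gb / review_per_gb

print(f"Review per GB:     ${review_per_gb:,.0f}")
print(f"Processing per GB: ${processing_per_gb:,.0f}")
print(f"Reduction needed:  {reduction_needed:.0%}")
```

The reduction works out to roughly 74%, which is consistent with the report’s “about 75%” figure for making review costs comparable to processing.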

Not surprisingly, the recommendations included taking “the bold step of using, publicly and transparently, computer-categorized document review techniques” for large-scale eDiscovery efforts.

So, what do you think?  Are you surprised by the cost numbers?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Case Law: Inadvertent Disclosure By Expert Waives Privilege

 

In Ceglia v. Zuckerberg, No. 10-CV-00569A(F), (W.D.N.Y. Apr. 19, 2012) (the case where Paul Ceglia is suing, claiming 84% ownership of Facebook due to an alleged agreement he had with Mark Zuckerberg back in 2003), New York Magistrate Judge Leslie G. Foschio ruled that an information technology expert’s inadvertent disclosure waived the attorney-client privilege where the plaintiff could not show that it (1) took reasonable steps to prevent the disclosure of the e-mail and (2) took reasonable steps to rectify the error once it discovered the disclosure.

This case involved a dispute over the authenticity of a contract, and in seeking assistance to resolve pretrial matters, the plaintiff filed this motion to compel and asserted, among other things, that the attorney-client privilege should protect an e-mail that was inadvertently disclosed to the defendants. The court set forth the standard under Federal Evidence Rule 502(b) that applies to whether an inadvertent disclosure waives a privilege: “the privilege will not be waived if (1) the disclosure is inadvertent; (2) the privilege holder took reasonable steps to prevent disclosure; and (3) the privilege holder took reasonable steps to rectify the error.” Furthermore, “the burden is on the party claiming a communication is privileged” to establish that it met these requirements and that “the opposing party will not be unduly prejudiced by a protective order.”

Because the plaintiff failed to “personally supervise” the actions of the information technology expert he had hired, even though he understandably hired the expert to assist him while he was out of town, he “also failed to take reasonable steps to prevent the inadvertent disclosure” of the e-mail. Judge Foschio suggested that the plaintiff could instead have had the expert “first forward any documents” so that the plaintiff “could have reviewed the documents to ensure there w[ere] no extraneous, privileged materials attached.” If the plaintiff needed to oversee the expert in person, the court admonished, he “should have made himself present to do so.”

Judge Foschio also found that the plaintiff did not take reasonable steps to rectify the inadvertent disclosure. Noting that “the delay in seeking to remedy an inadvertent disclosure of privileged material is measured from the date the holder of the privilege discovers [ ] such disclosure,” and that “[g]enerally, a request for the return or destruction of inadvertently produced privileged materials within days after learning of the disclosure is required to sustain this second element,” the court pointed out that the plaintiff not only waited more than two months to try to rectify the error but also offered no explanation for such a lengthy delay.

Moreover, Judge Foschio stated, “Plaintiff has utterly failed to offer any explanation demonstrating that belated protection of the . . . email will not be unduly prejudicial to Defendants.” Thus, because the plaintiff failed to establish any elements of the test required under the evidentiary rules, any privilege that may have attached to the disputed e-mail was waived.

Case Summary Source: Applied Discovery (free subscription required).

So, what do you think?  Should privilege have been waived, or should the plaintiff have been granted his request for the email to be returned?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: Wednesday LTWC 2012 Sessions

 

As noted yesterday, LegalTech West Coast 2012 (LTWC) is happening this week and eDiscoveryDaily is here to report about the latest eDiscovery trends being discussed at the show.  There’s still time to check out the show if you’re in the Los Angeles area, with a number of sessions (both paid and free) available and 69 exhibitors providing information on their products and services, including (shameless plug warning!) my company, CloudNine Discovery, which yesterday announced the release of Version 11 of our linear review application, OnDemand®, and will be exhibiting at booth #216 along with our partners, First Digital Solutions.  Come by and say hi!

Perform a “find” on today’s LTWC conference schedule for “discovery” and you’ll get 21 hits.  More eDiscovery sessions happening!  Here are some of the sessions in the main conference tracks:

10:30 AM – 12:00 PM:

Information Governance and Information Management

With the volume of electronically stored information (ESI) growing exponentially, the challenges of managing it, protecting it, and developing effective policies are more pressing than ever. Social media, email, IMs, web pages, mobile devices and the cloud have made a big job even bigger. How much or how little should you collect? How aggressive should you be? How can you be certain your approach and results are defensible?

Speakers are: Richard E. Davis, JD, e-Discovery Solutions Architect & Founder, Litigation Logistics, LLC; Jack Halprin, Head of eDiscovery, Google; Dawson Horn, III, Senior Litigation Counsel, Tyco International and David Yerich, Director, eDiscovery, UHG Legal Department, United Health Group.

The GARP® Principles and eDiscovery

Attendees will hear from experts on the GARP Principles and eDiscovery as well as:

  • Understand the importance of proactive records management through the eight GARP® Principles
  • Revisit the GARP® Principles and learn how their role is magnified by recent case law
  • Learn what to do before eDiscovery: how GARP® precedes and complements the EDRM

Speakers are: Gordon J. Calhoun, Esq., Lewis Brisbois Bisgaard & Smith LLP; Lorrie DeCoursey, Former Law Firm Administrator and John J. Isaza, Esq., Partner, Rimon P.C.  Moderator: David Baskin, Vice President of Product Management, Recommind.

1:30 – 3:00 PM:

Practical Handbook for Conducting International eDiscovery – Tips and Tricks

This session will present a truly international view on how to conduct global eDiscovery from a practical perspective, including developing proactive global document retention policies and assuring multi-jurisdictional compliance, best practices of global data preservation and collection, successful data migration across jurisdictions, navigating unique cultural and procedural challenges in various global regions, handling multi-lingual data sets as well as strategic positioning of hosting data centers.

Speakers are: Monique Altheim, Esq., CIPP, The Law Office of Monique Altheim; George I. Rudoy, Founder & CEO, Integrated Legal Technology, LLC and David Yerich, Director, eDiscovery, UHG Legal Department, United Health Group.

Litigation Preparedness Through Effective Data Governance

Be prepared. This panel will go through the benefits of data governance in your litigation preparedness and discuss benefits such as:

  • Auto-classification of legacy and newly created content
  • What is email management and is it ready for prime-time?
  • Review the court's findings on the complexities of ESI, including metadata, native formats, back-up tapes, mobile devices, and legacy technology
  • Key questions to ask before outsourcing ESI to the cloud

Speakers are: Lorrie DeCoursey, Former Law Firm Administrator; John J. Isaza, Esq., Partner, Rimon P.C. and Ayelette Robinson, Director – Knowledge Technology, Littler Mendelson.  Moderator: Derek Schueren, GM, Information Access and Governance, Recommind.

3:30 – 5:00 PM:

Managed and Accelerated Review

As costs for review soar and volumes of data multiply at an almost exponential rate, traditional linear review seems to be giving way to new technologies that will enable faster, better, more defensible eDiscovery results. How can you be assured that this new approach will catch everything that needs to be captured? Will human review become obsolete? What do you need to ask when considering this new technology? How should it be incorporated into your overall litigation strategy?

Speakers are: Matthew Miller, Manager, Fraud Investigation & Dispute Services, Ernst & Young; Robert Miller, Founder, Rise Advisory Group, LLC; Former Discovery Counsel, BP; David Sun, Discovery Project Manager, Google.

eDiscovery Circa 2015: Will Aggressive Preservation/Collection and Predictive Coding be Commonplace?

Who's holding back on Predictive Coding, clients or outside counsel? This session will discuss whether aggressive preservation/collection and predictive coding will become commonplace, as well as:

  • How aggressive should clients be with preservation/collection?
  • How to use effective searching, sampling, and targeting tools and techniques to not over-collect

Speakers are: Gordon J. Calhoun, Esq., Lewis Brisbois Bisgaard & Smith LLP; Lorrie DeCoursey, Former Law Firm Administrator and Greg Chan, Senior Regional Litigation Technology Manager, Bingham McCutchen LLP.  Moderator: David Baskin, Vice President of Product Management, Recommind.

In addition to these, there are other eDiscovery-related sessions today.  For a complete description for all sessions today, click here.

So, what do you think?  Are you planning to attend LTWC this year?  Please share any comments you might have or if you’d like to know more about a particular topic.
