Exam 70-487: Accessing Data - How to choose the Appropriate Data Access Technology - Entity Framework?


Since the days of Visual Studio 2008 and .NET Framework 3.5 SP1, released in 2008, Entity Framework has served as the primary data-access technology for the .NET ecosystem. Entity Framework is an object-relational mapper (O/RM) that enables .NET developers to work with a database using .NET objects. It eliminates the need for most of the data-access code that developers usually have to write.

While Microsoft is aggressively pushing cloud versions of pretty much every piece of software it has ever created, Entity Framework still enjoys tremendous popularity; last year Microsoft described it as among the most popular packages listed on NuGet.org. That is hardly surprising, because, as Microsoft puts it, "Entity Framework is Microsoft's recommended data access technology for new applications."

A comprehensive version history, from EF 3.5 up to the current EF 6.2, can be found here, and the current state of EF and future updates (Entity Framework Core, Entity Framework 6, and guidance on comparing and choosing between EF Core and EF6) can be found at the Entity Framework Documentation page.

For the purpose of Exam 70-487, most of the focus will be on Entity Framework 6.

Once upon a time, in most Computer Science classes, the two prevalent paradigms in programming were OOP (Object-Oriented Programming), which provides a very intuitive and straightforward way to model real-world problems, and relational databases (RDBMS) queried with ANSI SQL (Structured Query Language) in its multiple vendor-specific flavors (Transact-SQL, or T-SQL, in the Microsoft world).

In most non-trivial applications, developers discovered significant gaps between the object-oriented model they designed and the ideal structure they devised for data storage. This gap is referred to as the "impedance mismatch," and for those who coded through those days, bridging it required a lot of plumbing code.

To help solve the impedance mismatch problem, a technique known as object-relational mapping (ORM, O/RM, or O/R mapping) was created. Many of us in the .NET ecosystem flocked to NHibernate. Around that time, Microsoft made its first attempt at an ORM tool with something called LINQ to SQL (LINQ, or Language Integrated Query, was introduced by Microsoft to allow .NET developers to interact with data directly from the language).
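To see what LINQ itself looks like, here is a minimal sketch using plain LINQ to Objects (no database involved); the same query syntax later carries over to LINQ to SQL and to Entity Framework:

```csharp
using System;
using System.Linq;

class Demo
{
    static void Main()
    {
        int[] numbers = { 5, 12, 8, 3 };

        // Query syntax: declarative filtering and ordering over an in-memory array
        var big = from n in numbers
                  where n > 4
                  orderby n
                  select n;

        foreach (var n in big)
        {
            Console.WriteLine(n); // prints 5, 8, 12
        }
    }
}
```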

It was in this context that Microsoft embarked on Entity Framework as its ORM, and in the last decade it has become the de facto data-access technology for .NET. The primary benefit of EF is that it enables developers to manipulate data as domain-specific objects without regard to the underlying structure of the data store. In EF parlance, this is known as the conceptual model.
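As a concrete illustration, here is a minimal sketch of that idea with Entity Framework 6. The Customer entity, the ShopContext class, and the backing database are hypothetical; the point is that the query is written against domain objects, not against store-specific SQL:

```csharp
using System;
using System.Data.Entity; // Entity Framework 6 (EntityFramework NuGet package)
using System.Linq;

// A hypothetical domain entity: a plain .NET object with no store-specific details
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The context exposes the conceptual model to application code
public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

public static class Program
{
    public static void Main()
    {
        using (var db = new ShopContext())
        {
            // LINQ against domain objects; EF translates this to SQL for
            // whatever data store the model is mapped to
            var names = db.Customers
                          .Where(c => c.Name.StartsWith("A"))
                          .OrderBy(c => c.Name)
                          .Select(c => c.Name)
                          .ToList();

            names.ForEach(Console.WriteLine);
        }
    }
}
```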

As you study EF and prepare for the exam, it is important to know that there are three parts to EF modeling, and it is critical to understand each of them and the role it plays:


  • The conceptual model: handled by the CSDL, or Conceptual Schema Definition Language (in older versions of EF, this existed in a file with the extension .csdl)

  • The data storage: handled by the SSDL, or Store Schema Definition Language (in older versions of EF, this existed in a file with the extension .ssdl)

  • The mapping between the conceptual model and the data storage: handled by the MSL, or Mapping Specification Language (in older versions, this existed in a file with the extension .msl)

Now, note that in current versions of Entity Framework, the CSDL, SSDL, and MSL content all lives as sections of a single file with the .edmx extension.
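To make that layout concrete, here is a simplified sketch of an .edmx file's skeleton; the element names are the real ones, but the contents are elided:

```xml
<!-- Simplified sketch of an .edmx file: all three model parts live in one XML document -->
<edmx:Edmx Version="3.0" xmlns:edmx="http://schemas.microsoft.com/ado/2009/11/edmx">
  <edmx:Runtime>
    <edmx:StorageModels>
      <!-- SSDL: tables, columns, and keys as they exist in the data store -->
    </edmx:StorageModels>
    <edmx:ConceptualModels>
      <!-- CSDL: the entities and associations your code works with -->
    </edmx:ConceptualModels>
    <edmx:Mappings>
      <!-- MSL: how conceptual entities map to storage tables -->
    </edmx:Mappings>
  </edmx:Runtime>
  <edmx:Designer>
    <!-- Diagram layout used by the Entity Model Designer -->
  </edmx:Designer>
</edmx:Edmx>
```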



With this level of modularity in EF modeling, you can easily swap out back-end components without affecting the conceptual model; the changes are absorbed by the MSL's mapping logic.

In practical use, and for Exam 70-487, this is an important differentiator: with the ADO.NET approach we previously studied, changing the underlying data store forces you to rewrite a lot of code, whereas with Entity Framework only the mapping needs to be updated to handle the change. This alone can justify the choice of one technology over the other.
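You can see this isolation in the application's configuration as well. Below is a hedged sketch of an EF 6 database-first connection string (the "ShopModel" and "ShopEntities" names are hypothetical): the store provider and its connection details are the only store-specific parts, while the CSDL that your code depends on stays untouched when the back end changes.

```xml
<!-- Sketch of an EF 6 database-first entry in App.config; "ShopModel" and
     "ShopEntities" are hypothetical names. Only the provider and the
     "provider connection string" segments are store-specific. -->
<connectionStrings>
  <add name="ShopEntities"
       connectionString="metadata=res://*/ShopModel.csdl|res://*/ShopModel.ssdl|res://*/ShopModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=.;initial catalog=Shop;integrated security=True&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>
```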

As a developer, you will mostly be concerned with the conceptual model; as an architect or technical lead, you must have a thorough understanding of all the moving parts.

The first thing to do when working with EF is to build the models, and the set of EDM (Entity Data Model) tools helps you create your conceptual model in two basic ways. The first way is called Database First: you build a database, or use an existing one, and generate the conceptual model from it. The second way is the reverse trip, called Model First: you design your conceptual model first and let the tools build a database for you. (A third approach, Code First, builds the model from POCOs, or Plain Old CLR Objects, without the designer, but the designer-based workflows are the focus here.)

When you create a new EF project, you create an .edmx file. The current toolset includes four principal items that you need to understand:



  • The Entity Model Designer: creates the .edmx file and enables you to create, update, or delete entities, manipulate associations, and manage mappings and inheritance relationships.

  • The Entity Data Model Wizard: the starting point for building your conceptual model; used in the Database First scenario.

  • The Create Database Wizard: used to build a database from the conceptual model in a Model First scenario.

  • The Update Model Wizard: used after the model is built to update the .edmx file when the underlying database changes.


You must master these four tools to face the exam confidently and to be efficient in any real-life project. Jump into Visual Studio and try them all.






