# Monitor Performance - Back-end Performance

This document covers the back-end performance of Configured Commerce, an Optimizely product. It outlines the Admin API architecture, highlighting RESTful interfaces and integration capabilities, and explains how to configure and use the Admin API to manage different aspects of the system.


## Configured Commerce Admin API architecture

Overview of the Admin API architecture for Optimizely Configured Commerce.

The Optimizely Configured Commerce Admin Console delivers a modularized software architecture for improved performance, as well as easier upgrades and integration. Each module has a RESTful interface for communicating between modules and the UI. The RESTful APIs allow third-party applications (client or partner) to read and write administration data without going through the Admin Console.

> 📘 Note
> Access the OpenAPI Specification (aka Swagger) to view and test endpoints at [YOUR_WEBSITE_URL]/swagger.

### Admin API architecture

Configured Commerce uses Entity Framework Code First version 6. In general, Entity Framework autogenerates OData controllers. Using a custom-built OData T4 template, Configured Commerce autogenerates the OData controllers based on the entities available. The ODataControllers.tt T4 template is maintained as base code in the Insite.Admin library. To extend this functionality, implement a custom T4 template independently; you cannot extend the existing T4 template.

Configured Commerce supports child navigation properties one level deep. For example, websites have countries as a child collection.

Entities can have properties that are archivable, meaning they have an active or deactivated state. Deleting an item archives that entity. Archiving is used generically across entities, and archiving an entity is synonymous with setting the archiving property of that entity type. For example:

- Setting the DateTime deactivate property to today's date archives the product entity.
- Setting the Boolean archive property to False archives the customer entity.

You can use the Admin API to retrieve archived entities by appending the archiveFilter=1 OData query parameter. The following example retrieves archived customers: /api/v1/admin/customers?archiveFilter=1

Querying, updating, and deleting are all handled in the Controllers/OData folder.

### Consume the Admin API

Postman is an API platform for building and using APIs. It simplifies each step of the API lifecycle and streamlines collaboration so you can create better APIs. You can also use tools such as cURL or Fiddler. You can download the Postman collection with the requests for this process.

To consume the storefront REST APIs, create a POST request with the URL [http or https]://[YOUR_WEBSITE_URL]/identity/connect/token. The examples below use localhost:3010 as the website URL.

1. Select Authorization > Basic Auth and enter the following credentials:
   - Username – isc
   - Password – 009AC476-B28E-4E33-8BAE-B5F103A142BC
2. Go to Body, choose x-www-form-urlencoded, and fill in the following information:

| Key | Value |
| --- | --- |
| grant_type | password |
| username | [web user name], for example: basicuser |
| password | [web user password], for example: Password1 |
| scope | iscapi offline_access |

The Admin API requires a different scope and user credentials that have access to the Admin Console; the clientid:secret pair is also different from the one used for website access.

The offline_access scope returns a refresh token along with the bearer token. The bearer token expires after 15 minutes (900 seconds), so the refresh token lets you get a new bearer token without specifying the username and password again. The Admin Console uses the refresh token in this way: the JavaScript automatically gets a new bearer token. The refresh token is stored in localStorage and is only used when requesting a new bearer token.
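For readers working outside Postman, the following is a minimal sketch of the same token request in C#. The isc client Id and secret are the values given above; the site URL and web user credentials are placeholders for your own environment.

```csharp
// Minimal sketch: request a storefront bearer token from the identity endpoint.
// The site URL and web user below are placeholders; adjust for your environment.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class TokenExample
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Basic auth header carries the clientid:secret ("isc" storefront client).
        var basic = Convert.ToBase64String(
            Encoding.UTF8.GetBytes("isc:009AC476-B28E-4E33-8BAE-B5F103A142BC"));
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Basic", basic);

        // x-www-form-urlencoded body with the password grant and scopes.
        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "password",
            ["username"] = "basicuser",   // [web user name]
            ["password"] = "Password1",   // [web user password]
            ["scope"] = "iscapi offline_access",
        });

        var response = await client.PostAsync(
            "https://localhost:3010/identity/connect/token", form);
        response.EnsureSuccessStatusCode();

        // The JSON response contains access_token, refresh_token, and expires_in (900).
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```

Because offline_access was requested, the response contains a refresh_token alongside the access_token and the 900-second expires_in described above.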
To consume the Admin REST API, create a POST request with the URL [http or https]://[YOUR_WEBSITE_URL]/identity/connect/token.

1. Select Authorization > Basic Auth and use the following credentials:
   - Username – isc_admin
   - Password – F684FC94-B3BE-4BC7-B924-636561177C8F
2. Go to Body, choose x-www-form-urlencoded, and fill in the following information:

| Key | Value |
| --- | --- |
| grant_type | password |
| username | admin_[console user name], for example: admin |
| password | [console user password], for example: Password1 |
| scope | isc_admin_api offline_access |

If the bearer token has expired, any call made to the Admin API receives a 401 Unauthorized response. For example, this happens if you make a call to http://42.local.com/api/v1/admin/websites using an outdated Authorization bearer token.

Because OData exposes the API, you can apply query string parameters using the OData standards, such as count to retrieve a certain number of results. For more information regarding OData parameters, visit the OData URI conventions website. Configured Commerce returns a maximum of 100 results in an Admin API request, and OData.NextLink requests the next set of results. The OData.NextLink JSON value is the name of the API endpoint plus the query string with skip=100.

### Return a single object from the Admin API

Unlike the website API (the Storefront API), the Admin API uses OData syntax. The following examples show how to retrieve a single product using each API:

- Storefront API single-object syntax – /api/v1/products/f88d5c07-eb72-42eb-ab36-a5d201168a49
- Admin API single-object syntax – /api/v1/admin/products(f88d5c07-eb72-42eb-ab36-a5d201168a49)

### Return a child collection from the Admin API

Because OData only supports the entity and not its child collections, Configured Commerce supports child collections one level deep. You can retrieve child collections on a RESTful JSON result in two ways:

1. Use the expand parameter in the query string.
2. Use the name of the child entity after the slash.

The following examples show how to retrieve a child collection of a website using the Admin API:

- Using the expand parameter – /api/v1/websites(d24h5c07-eb72-42eb-ab36-a5d201168jh5)?$expand=countries
- Using the name of the child entity after the slash – /api/v1/websites(d24h5c07-eb72-42eb-ab36-a5d201168jh5)/countries

### Update entities using the Admin API

All entities within Configured Commerce have a Patch endpoint in the Admin API. The following example updates a specific website to make it inactive:

```json
{
  "isActive": false
}
```

In the body of the request, set the property and value using JSON notation. Additionally, the request header must contain the authentication bearer token.
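As a minimal sketch of such a PATCH call in C#, reusing the website Id from the child-collection examples above as a placeholder and a bearer token obtained as described earlier:

```csharp
// Minimal sketch: PATCH a website entity via the Admin API to deactivate it.
// The website Id and token are placeholders, not real values.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class PatchExample
{
    static async Task Main()
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access_token>");

        // JSON body sets only the properties you want to change.
        var body = new StringContent("{ \"isActive\": false }",
            Encoding.UTF8, "application/json");

        // OData-style single-object address: entity key in parentheses.
        var request = new HttpRequestMessage(new HttpMethod("PATCH"),
            "https://localhost:3010/api/v1/admin/websites(d24h5c07-eb72-42eb-ab36-a5d201168jh5)")
        {
            Content = body
        };

        var response = await client.SendAsync(request);
        // 401 here is the signal, described above, that the bearer token expired.
        Console.WriteLine((int)response.StatusCode);
    }
}
```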
### OData entities

The Insite.Data.Entities library contains the models as simple objects; there is no business logic in the entity objects. Configured Commerce replaced the Insite.Model library with the Insite.Data.Entities library and splits all entities into modules. Configured Commerce follows the UnitOfWork pattern, so there is a data provider implementation for Entity Framework. Consuming the data has not changed: you can use a repository from the unit of work to retrieve an IQueryable object. All entities have the OData RESTful service available. The T4 template generates the classes and defines them as partial classes for extensibility, so you can make any additions or extensions to the autogenerated controllers using partial classes. The Admin API respects any security constraints, including the authenticated user.

### Swagger

You can view and interact with both the Storefront and Admin API endpoints via Swagger. Configured Commerce Help contains links for both the Storefront API and Admin API.

> 📘 Note
> You can also view the available endpoints by scrolling in the left navigation to Storefront API V1, Storefront API V2, or Admin API V1.

### Admin API endpoints

The following definitions describe the API endpoints for data-level objects, also known as Admin objects. The URL prefix for all of them is /api/v1/admin/.

| HTTP Verb | URL | Description |
| --- | --- | --- |
| GET | entity | Retrieves all of the entity objects |
| POST | entity | Creates a new object |
| GET | entity({id}) | Retrieves the object by unique Id |
| PUT | entity({id}) | Updates or creates an object by unique Id |
| DELETE | entity({id}) | Deletes an object by unique Id |
| PATCH | entity({id}) | Updates properties on an existing object by unique Id |
| GET | entity/Default.Default() | Returns a new instance of that object with all of the default values |
| GET | entity({key})/child({childKey}) | Retrieves a single child by unique Id relative to the parent object |
| GET | entity({key})/customproperties({custompropertyKey}) | Retrieves the value of a single custom property on an object |

## Work with the data model and Entity Framework ORM

This section describes goals, practices, usage, extensibility, and performance for the Configured Commerce data layer.

Optimizely uses Entity Framework for Configured Commerce. The following topics help you acclimate to the data layer: goals and best practices, basic usage, extensibility, and performance.

### Goals and best practices

The goals of the Configured Commerce data layer are to:

1. Use Entity Framework.
2. Encourage a "thin" data layer, that is, a data layer whose entities have little to no business logic.

#### Entity Framework

Entity Framework (EF) is open source and part of the .NET Framework. EF is the default object-relational mapper (ORM) for Configured Commerce. This type of tool simplifies mapping between objects in software and the tables and columns of a relational database.

#### Skinny data model

We want to host all business logic in the service layer, leaving the entities as simple as possible. This eliminates complexity in the data layer and allows for better control and extensibility throughout the platform. We encourage you to follow this paradigm as you extend the platform.

#### Insite.Data

The Insite.Data namespace is the root namespace for entity-related components in the Configured Commerce framework. The models live in Insite.Data.Entities, and many other data-related components are included in this namespace. Insite.Data is the main dependency for all data layer-related logic.

### Basic usage

#### Unit of work and repositories

The recommended way to access data is to obtain a UnitOfWork object, get the appropriate Repository object, and use that repository to access the entities.

#### Retrieving entities

To get an entity by Id, get the object by its primary key by calling the Get method on the corresponding repository.

#### Get a list of entities

To retrieve a list of entities, call the GetTable method, which returns an IQueryable that can be filtered (Where), sorted (OrderBy), or mapped (Select). If you would rather write your own optimized SQL for retrieving the entities, you can do so and call GetList.

> 📘 Note
> GetList returns the values immediately. Use it with care and only when necessary.
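A minimal sketch of this usage follows. The Product entity and the Get/GetTable repository methods come from the text above, but the IUnitOfWork type name, the imported namespaces, and the Product property names used in the query are assumptions for illustration.

```csharp
// Minimal sketch: retrieving entities through the UnitOfWork/Repository
// pattern described above. Some namespaces and member names are assumed.
using System;
using System.Linq;
using Insite.Data;            // assumed home of IUnitOfWork
using Insite.Data.Entities;

public class ProductReader
{
    private readonly IUnitOfWork unitOfWork;

    public ProductReader(IUnitOfWork unitOfWork)
    {
        this.unitOfWork = unitOfWork;
    }

    // Get a single entity by its primary key.
    public Product GetProduct(Guid productId)
    {
        return this.unitOfWork.GetRepository<Product>().Get(productId);
    }

    // GetTable returns an IQueryable that can be filtered (Where),
    // sorted (OrderBy), and mapped (Select) before it hits the database.
    public string[] GetActiveProductNames()
    {
        return this.unitOfWork.GetRepository<Product>()
            .GetTable()
            .Where(p => p.DeactivateOn == null)  // assumed archivable property
            .OrderBy(p => p.Name)                // assumed property
            .Select(p => p.Name)
            .ToArray();
    }
}
```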
### Retrieve related data

#### Lazy loading

By default, lazy loading is enabled and can be used to retrieve entities and collections of entities related to your entity in the database. A separate SQL call is made once the collection is accessed in the code. For example, retrieving a job and then accessing Job.CustomProperties results in two SQL calls: one for the Get, and one when Job.CustomProperties is accessed.

#### Eager loading

Although lazy loading is great for a number of use cases, it is sometimes advantageous to load collections ahead of time if you know you are going to access them. This optimizes the process by making only one SQL call. If we always access the CustomProperties collection, instead of lazy loading it we can eagerly load it via the Include method provided by Insite.Data.Extensions.

> 📘 Note
> You must return an IQueryable to use this method.

### Update entities

#### Create an entity

Creating an entity consists of constructing a new instance of the desired entity type and calling the Insert method on the corresponding repository.

> 📘 Note
> To actually save the entity to the database, you also need to call either the Save or SaveAsync method on the UnitOfWork, as covered below.

#### Update an entity

Updating an attached entity requires an update to the instance, followed by a call to the Save method on the UnitOfWork instance. You can also save asynchronously by calling SaveAsync on the UnitOfWork instance.

> 📘 Note
> You do not have to call Save or SaveAsync immediately after updating the entity.

#### Delete an entity

Deleting an entity requires you to invoke the Delete method on the Repository, followed by Save or SaveAsync on the UnitOfWork.

> 📘 Note
> You do not have to call Save or SaveAsync immediately after deleting the entity.

### Extend the data model

#### Alter the data model

The officially supported best practice and method for changing the data model is to add entities to the model. Removing Optimizely tables is not supported. See Creating Custom Tables with an Entity and WebApi for details about table extension.

#### Create an entity class

To add to the data model, first create a POCO (Plain Old CLR Object) that extends Insite.Data.Entities.EntityBase. This example uses a Job class. A Table attribute/annotation can be added to the class, but it is superfluous because convention assumes the name of the class. A list of common annotations for EF can be found in Code First Data Annotations.

#### Add properties

Once your class is created, you can begin to add properties. Properties can be of varying types, including other entities, but must have "virtual" as part of their signatures. Keeping in mind the principle of a "thin" data layer, it is best practice to leave out computed properties or methods.

#### Map the entity

Once you have created your entity class, you must create a mapping class. This class defines any custom relationships that fall outside standard Entity Framework conventions. If your entity follows convention, no custom logic is required, but the class must still exist and must extend Insite.Data.Providers.EntityFramework.EntityMappings.EntityBaseTypeConfiguration for your entity to be bootstrapped to the context correctly. Primarily, you will only need to add custom mapping logic to your mapping class if there is a many-to-many relationship between your entity and another; if that is the case, you can follow standard Entity Framework guidance on creating the mapping. A sketch of an entity and its mapping class follows.
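The original article illustrates these steps with screenshots. As a reconstruction under the conventions described above, where the Job class name comes from the text but its specific properties are illustrative assumptions:

```csharp
// Minimal sketch: a custom Job entity and its mapping class, following the
// conventions described above. The Job class name comes from the article;
// the specific properties are illustrative assumptions.
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations.Schema;
using Insite.Data.Entities;
using Insite.Data.Providers.EntityFramework.EntityMappings;

// The Table attribute is superfluous here because convention assumes the
// table name matches the class name.
[Table("Job")]
public class Job : EntityBase
{
    // Properties must be virtual so EF can generate lazy-loading proxies.
    public virtual string Name { get; set; }

    public virtual DateTimeOffset? CompletedOn { get; set; }

    // Navigation properties (including child collections) are also virtual.
    public virtual ICollection<CustomProperty> CustomProperties { get; set; }
        = new List<CustomProperty>();
}

// Required even when empty so the entity is bootstrapped to the context;
// the generic form of the base class is assumed here.
public class JobConfiguration : EntityBaseTypeConfiguration<Job>
{
    // Custom mapping logic is only needed for non-conventional relationships,
    // such as many-to-many; this example follows convention.
}

// Creating and saving an instance then follows the repository pattern:
//   var job = new Job { Name = "Nightly import" };
//   unitOfWork.GetRepository<Job>().Insert(job);
//   unitOfWork.Save();   // or await unitOfWork.SaveAsync();
```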
### Alter the database schema

#### Migrations

Migrations are not currently supported. Any changes made to the database schema need to be done via SQL files embedded in your custom DLLs.

#### Create a script

Creating a script consists of adding a new SQL file to your Visual Studio project. The file should have the format of ….sql. The file can be added anywhere in the project's folder structure.

#### Include a script

Including a script in the build consists of setting the Build Action property of the file to Embedded Resource and the Copy to Output Directory property to Do not copy.

#### Run a SQL script

SQL scripts that are embedded resources run automatically during the bootstrap operation. If a SQL script executes successfully, it is added to the DatabaseScript table in the database. This is done for auditing purposes, and to ensure that a script is not run twice. You can query this table to determine whether your script has been run.

### Performance considerations

#### WhereContains

There is a known issue using LINQ with Entity Framework in conjunction with ICollection.Contains() inside a Where clause: this approach can severely degrade the performance of your application. To alleviate the issue, Optimizely provides an extension method, WhereContains, which gives the same result in a much more performant manner; see the sketch at the end of this section.

#### WithNoTracking

Another optimization is to work with a set of data that does not have change tracking enabled. This improves read performance, and can also improve the performance of batch operations. Entity Framework accomplishes this via the AsNoTracking extension method, found in DbExtensions.AsNoTracking. To limit dependencies on Entity Framework, Optimizely Configured Commerce exposes the same functionality on all of its repository instances. To get a collection without tracking, invoke the GetTableAsNoTracking method on the Repository. For example:

```csharp
public ICollection<CustomProperty> GetJobCustomProperties(Guid jobId)
{
    var job = this.unitOfWork.GetRepository<Job>()
        .GetTableAsNoTracking()
        .Where(j => j.Id == jobId)
        .Include(j => j.CustomProperties)
        .FirstOrDefault();

    return job.CustomProperties;
}
```

Typical usage would be scenarios where no updates to the entity are expected, or optimized sections where ObjectState and attachment to the context will be actively managed.
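To close out this section, the following sketch reconstructs the WhereContains refactoring described above, reusing the Job entity from the data-model example. The exact namespace and signature of WhereContains are assumptions.

```csharp
// Sketch: replacing ICollection.Contains() inside a Where clause with the
// platform's WhereContains extension. The WhereContains signature shown
// here is assumed; Job and IUnitOfWork follow the earlier examples.
using System;
using System.Collections.Generic;
using System.Linq;

public class JobQueries
{
    private readonly IUnitOfWork unitOfWork;

    public JobQueries(IUnitOfWork unitOfWork)
    {
        this.unitOfWork = unitOfWork;
    }

    // Slow: EF translates ids.Contains(...) into a SQL IN clause that
    // degrades badly as the list of ids grows.
    public IList<Job> GetJobsSlow(ICollection<Guid> ids)
    {
        return this.unitOfWork.GetRepository<Job>()
            .GetTable()
            .Where(j => ids.Contains(j.Id))
            .ToList();
    }

    // Faster: WhereContains produces the same results in a much more
    // performant manner.
    public IList<Job> GetJobsFast(ICollection<Guid> ids)
    {
        return this.unitOfWork.GetRepository<Job>()
            .GetTable()
            .WhereContains(j => j.Id, ids)
            .ToList();
    }
}
```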
## Improve Configured Commerce performance

Optimizely Configured Commerce is a large and complex set of modules and APIs offering a tremendous depth of capabilities for B2B distributors and manufacturers. While we strive to make performance one of the key "abilities" we focus on, it is balanced against maintainability of the code, modularity of our tooling, testability, and the ability to move quickly. Software always includes some set of tradeoffs to deliver maximum value. The intent of this document is to outline some best practices and some pitfalls to watch out for when developing your Configured Commerce solution. This is by no means an exhaustive list, but it should provide a good baseline and approach for improving the overall performance of your site.

### Types of performance bottlenecks

Performance issues can arise in many areas, but they tend to fall into one of several general categories:

- Front-end performance – problems that come from bad JavaScript code, too many libraries to download, and very large images. These problems tend to hit every user the same, but are exacerbated by slow connections, such as a mobile device running a responsive website.
- API performance – problems that crop up due to coding issues in the APIs, looping issues (running the same code over and over), or database issues. These can appear for a single user or get worse as load increases, and may be in base code or in custom code.
- Third-party interactions – problems that typically stem from using an external provider, such as real-time calls to the ERP for pricing. These are often the most difficult to manage since they tend to be out of Optimizely's or the developer's control. There are, however, some strategies to employ.

### The general approach

The approach to performance falls into three broad categories:

- Planning – It is never too early to think about performance. When thinking through the approach to a given customization, or even implementation options, consider the performance impact. For example, while you could leverage Promotions to handle some specific type of pricing issue, if that would end up creating a promotion for every product (resulting in 10,000+ promotions), it is far better to resolve the issue within the pricing pipeline and NOT leverage the promotions engine. It is always more effective to anticipate and plan for a bottleneck than to resolve it once it is discovered.
- Detection – The next level is to identify where the problems exist. There are many tools and approaches for determining where a problem exists. Most of these are generic and not specific to Configured Commerce, so we will not dive too deeply into the exact tooling.
- Resolution – Finally, implementing a change to the approach or code is the ultimate way to resolve a performance problem. It is important to establish a baseline for a particular problem to see whether the proposed resolution actually addresses the problem detected.

When developers do their work, they typically work with small datasets, which can hide performance problems that are then exposed in a production environment. Internally, we use a Performance database that has significantly more data and introduces better edge cases for testing. We have a set of tests, enhanced over time, that exercise the system with synthetic transactions against this Performance database. See the table at the end of this document for the number of records we use for testing; if your instance has significantly more records than we test with, review those areas for performance. For example, if we test with 50 active promotions and you have 1,000 active promotions, there could be a significant performance issue.

The rest of this document focuses on common areas where performance issues can crop up and how to identify and resolve them.

### Front-end design

The following are areas we have found can be tuned for performance. We use Chrome developer tools and WebPageTest (www.webpagetest.org) for testing the theme and front end.
Since our application is a single-page app (SPA), we load all the JavaScript libraries on the first page load. This can be a hit on the site, especially if a lot of additional JS libraries are added to the project. The following items are not in any specific order of importance.

- Too much on the home page – Remembering that the home page takes a big hit on the JavaScript front and is the first page users see, it is important to have the best imagery and loading strategy on that page specifically. Adding too many large images or data controls will slow that page down.
- Large images – Optimize images to the size being displayed. A large source file is fine if you resize the image to the optimal display size for the actual site rather than letting the browser do it.
- Large JavaScript libraries – Leverage the existing libraries whenever you can; the current site uses libraries that will typically have the capabilities you want. Be intentional about adding additional libraries, and try not to add a large library to do a single function.
- Minimize fonts – Font files are quite large. If you want an icon from a font, consider isolating that component rather than downloading the entire font.

### Caching

Caching is the single biggest tool you can leverage to improve performance, but it must be used judiciously. Configured Commerce makes heavy use of caching throughout the application, but you must opt into the various settings and then implement caching strategies for custom code.

- eTag caching – This option is used to determine the state of an API request on the server so that, if a subsequent request is made, it returns a status 304 (Not Modified) indicating that the data has not changed. This prevents the server from having to recalculate the data or marshal it back to the web client, further improving performance. See Enable eTag caching for more considerations.
- Cache manager – Developers can use the cache manager within Configured Commerce to cache important data that does not change with great frequency. Give consideration to how much data is stored in the cache, to prevent automatic cache eviction from too much data. Additionally, weigh how long the cache should live against the cost of reloading it.
- Shared (distributed) cache – Configured Commerce also has a distributed cache capability via either Redis or SQL Server. This is currently used primarily for real-time inventory and pricing calls, to minimize the number of calls made to the ERP. Since the cache is stored in memory on each web server, using the distributed cache will first try to load the data from the server member and, if not available, will attempt to load it from the shared cache and cache it locally. This layered cache approach can yield very significant improvements in overall performance.
- Cache settings – Make sure to turn on the cache settings (CMS Content, Category Menu) and set the refresh minutes as high as is reasonable. Caching is typically disabled, or set to a low value, during testing so that changes are seen more quickly.

### Database access

The largest source of performance issues is typically calls to the database: too many calls, retrieving too much data, or poor queries. The shape of the data often affects the general performance of the database server, and since each site is different, it is difficult to give specific direction on how to detect and resolve these items.
- General approach – The best way to find performance issues is to use a tool like dotTrace to see the specific calls and how long they take, coupled with monitoring a local copy of the database using SQL Server Profiler. Any call taking longer than, say, 300 ms is a target for additional research.
- Query format – Take special care when looking at LINQ queries to make sure they retrieve only the data you want.
- Index hits – As you profile the site locally, you may notice specific queries are table scanning; this happens more frequently with a large number of joins. There are a number of techniques that can be deployed, including writing direct SQL queries (instead of LINQ), creating a stored procedure, or even creating a supporting table with the data you need. If an index is missing on standard Configured Commerce data, feel free to submit a ticket to support, and Engineering will consider adding indexes for performance. Sometimes, as we have found, adding an index fixes a problem on one site and makes it much worse on another; this all depends on the specific data.
- Expands – Each of the APIs generally has options to expand and retrieve related data. Since these always represent additional database calls, only ask the API for the things you actually need.

### Areas to monitor

The following are specific areas you should look at, or watch out for, relative to performance.

- Products – Products are the largest and most important construct in the system and are the most "expensive" resource to retrieve. As you design, make sure to get the products you need, but realize that the more products you retrieve, the slower the system will be.
  - Try to limit the number of entries in any given cross-sell/accessory/related-product widget, since each must retrieve the product and often needs to calculate inventory and pricing as well.
  - Keep the default page limit on the product list and search results pages as low as feasible (default = 8) so that too many products are not retrieved concurrently. This, of course, should be balanced against the overall needs of the site; the idea is not to design the system to retrieve, say, 50 or 100 products concurrently.
  - Variant products – When designed, we expected these products to have 2-4 traits, which the standard design supports. Having, say, 10+ traits could have display and interaction impacts on the product detail page. Additionally, if you have more than 40 variants on a product, there may be pricing and inventory issues that must be addressed.
- Custom properties – Custom properties are a very handy way to extend the data model for a given entity, since they are generic and configurable. Each one is represented in the database as a name/value pair, so the records are compact. Specifically for the Product entity, you must weigh the convenience of custom properties against performance concerns. For most entities, the system is not constantly reading different data, so retrieving, say, 50 custom properties for a customer is likely not a problem. Take those 50 properties to the Product or Category tables, however, and you will likely have performance concerns. One option is to alter the ProductCollection pipeline to retrieve only the properties you actually need; another is to use a custom table for the product instead of custom properties.
- Categories – The category structure should generally be no more than about 3 levels deep.
  Studies show that most people begin with search rather than navigating categories, and, if they do navigate, filters such as attribute filters provide a better experience than going through endless levels of category taxonomy.
- Customer segments – These are another powerful capability of the system, and we encourage you to use segmentation to best present your merchandising message. We would only caution that having too many segments can make managing the system and its content confusing. When you are on a page in the site and show the page source, you can see the list of segments assigned to the user (label = Personas).
- Attribute types/values – Having an excess of attributes can make search and faceting take longer, and can render a results page with a very long list of attributes, making the user experience confusing. Think through which attributes you want to expose for faceting and comparing, in particular.
- Calculating tax/shipping in cart – Tax is often calculated externally by making an API call to a tax service or the ERP to calculate the tax on the cart. This is another "expensive" operation. We have designed the system not to recalculate when going to the checkout page, but people tend to go to the cart far more frequently than they go to the checkout page. Shipping is the same: we call out to each of the enabled services, and that can be a hit to performance, so wait until you need to do it. This is controlled by a system setting.
- Promotions – Promotions are a very powerful capability of the system. The way they are calculated, however, is that every active promotion is run, typically on the checkout page. While powerful, given its metadata configuration approach, it can also be a bit slow. Having 10 or 15 concurrent promotions presents no significant problem, but having 500+ will. Setting "allow multiple promotions" to NO is of only limited help, since it only applies to the order; the system must still check all line-level promotions.
- Languages – The number of languages is not a significant issue in the system EXCEPT when building the Elasticsearch index. The system effectively builds a separate index for each active language, so only activate the languages you require.
- Real-time services – It is common practice to use APIs to retrieve pricing and inventory from the ERP, which is perfectly fine to do. If the ERP allows multiple products to be priced, with inventory coming back in the same API call, this is ideal. If the ERP requires a separate call per product and, additionally, an extra call for each inventory item, the latency will make the system very slow. In that case, we suggest using a Refresh job for inventory and relying on real-time calls only for pricing.
- Translations – Make sure to only create translation records for actual translations. If you have multiple languages and use the Generate Records option in the Admin Console, the system creates many empty records, which can slow down the system. If you do not use translations at all, disable the Enable Translation Properties option for even better performance.

### Code-specific things to monitor

The following guidance is illustrated in the sketch after this list.

- .ToList – In LINQ queries, anything in a statement filter that can be converted to SQL must be included BEFORE you enumerate, so that you retrieve only the data you need and do not overload the Entity Framework context. You do not want to filter data after it is marshalled to the client.
- Projections – Use projections if you do not need every field.
  For example, instead of First().Id, use .Select(o => o.Id).First(). You can project parts of objects using .Select(o => new { o.Id, o.CreateBy }) to extract just the Id and CreateBy columns. This reduces database load, network bandwidth, and EF processing time.
- .Any – Use .Any() instead of .Count() >= 1. .Any() stops after the first hit, while .Count() processes every candidate.
- Only return required data – All of the Storefront APIs let you designate the specific fields to bring back in the result set by adding a query string of &select=field1,field2…. Remember that this reduces the payload but does not lighten the query load on the servers.
- Repeating queries – Do not run repeating queries inside a loop. Scaling is very poor: each query requires a network round trip, causing linear growth in run time with more loop iterations. Instead, query for all needed data outside the loop. This performs better even if some of that data is not used.
- Stored procedures – In general, Entity Framework runs fine and allows for ease of upgrading, simplified code, and so on, but there are times when performance is paramount. A stored procedure will outperform EF with the same queries, supports multiple queries in a single network request, and can efficiently process large lists of input data via table-valued parameters. If you are unable to tune EF to deliver acceptable performance, even after following all of the above advice, stored procedures are a possible solution.
- Entity Framework tracking – EF tracks the state of all of its objects to ensure that if something is changed, it is committed back to the database. This is very convenient, and a key feature of an ORM, but it takes additional overhead. If you are retrieving data that you know does not need to be modified, use GetTableAsNoTracking() when retrieving it. If you are not, be mindful of how many entities you load into the context: limit it to no more than 100, then call UnitOfWork.Save() and UnitOfWork.Clear() to limit the overhead of EF change tracking.
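A minimal sketch pulling several of these points together, using a hypothetical Order entity purely for illustration:

```csharp
// Sketch of the LINQ guidance above, using a hypothetical Order entity.
using System;
using System.Linq;

public static class OrderQueryExamples
{
    public static void Demonstrate(IQueryable<Order> orders, Guid customerId)
    {
        // Filter BEFORE enumerating so the Where translates to SQL;
        // calling ToList() first would marshal every row to the client.
        var recent = orders
            .Where(o => o.CustomerId == customerId)   // runs in SQL
            .OrderByDescending(o => o.CreatedOn)
            .Take(10)
            .ToList();                                // enumerate last

        // Projection: select only the columns you need instead of whole rows.
        var summaries = orders
            .Where(o => o.CustomerId == customerId)
            .Select(o => new { o.Id, o.CreateBy })    // just Id and CreateBy
            .ToList();

        // .Any() stops at the first hit; .Count() >= 1 counts every candidate.
        bool hasOrders = orders.Any(o => o.CustomerId == customerId);
    }
}

// Hypothetical entity shape for the example above.
public class Order
{
    public Guid Id { get; set; }
    public Guid CustomerId { get; set; }
    public string CreateBy { get; set; }
    public DateTimeOffset CreatedOn { get; set; }
}
```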
### Performance database

The following is a list of the current database we use for performance testing. Use it as a guideline for the number of items in given tables that we test with; if your actual count is significantly greater, that is an area to pay particular attention to for performance. There is no guarantee that the shape of your data will be similar enough to this database that things fast for our database are fast for yours, or that significantly more records in a given table is necessarily a problem; this is provided simply as a guideline. Keep in mind that these target data sizes are intended to exercise the system with reasonable loads as a baseline, not to exercise it at its limits. These numbers are based on existing sites and performance areas we want to be sure to test.

| Table | #/Records | Notes |
| --- | --- | --- |
| Attribute Types | 1,000 | |
| Attribute Values | 50,000 | |
| Carriers | 50 | |
| Categories | 1,000 | |
| Content | 10 | Variant homepages |
| Content Item | 50,000 | |
| Customer ShipTo | 52,000 | 100 shiptos assigned to a single customer |
| Custom Property | 720,000 | 10 each for customers and products |
| Customers - Salesperson | | 1,000 customers to a salesperson |
| Dealer | 50 | In the same geographical location |
| Document | 25 | Assigned to a product |
| Experiment | 10 | |
| Global Synonyms | 250 | |
| HTML Redirects | 250 | |
| Images | 25 | Images assigned to a product |
| Invoice History | 100 | For a single customer |
| Invoice History Line | 100 | For a single order |
| Language | 3 | |
| Order History | 100 | For a single customer |
| Order History Line | 100 | For a single order |
| Persona | 10 | Customer segments |
| Product | 165,000 | |
| Product Unit of Measure | 10 | UoM assigned to a product |
| Product Attribute Values | 25 | Attributes for a single product |
| Product Cross Sells | 100 | For a product |
| Promotions | 75 | |
| Restriction Groups | 5,000 | |
| Restriction Group Customer | 6,900 | |
| Restriction Group Product | 190,000 | |
| Section | 10 | Sections with 10 options per section; used for configured products |
| Specifications | 10 | For a product |
| Variants | 10 | Variant parents (products 1000000-1000010), all variants in one category (Category Alpha > Category005); product 1000000 has 225 children |
