
Search Results

  • What Is The Power BI Data Gateway?

    The on-premises data gateway acts as a bridge between the cloud-based Power BI service and data sources stored on-premises, within an organization's own infrastructure. It allows users to schedule data refreshes so that Power BI stays up to date with the latest data from those on-premises sources, and it provides secure data transfer using encrypted connections and secure authentication methods. Overall, the gateway is a useful tool for organizations that want to access and analyze on-premises data through the Power BI service, because it lets users connect to that data from within Power BI without moving it to the cloud. A single data gateway can also serve all of your cloud applications; for example, one Power BI gateway can be shared by Power BI, Power Apps, Azure Logic Apps, and Power Automate. Installation is easy, and cloud credentials are encrypted and decrypted within the gateway.
There are two types of data gateways that can be used with Power BI. On-premises data gateway: a service that allows users to connect to and access data stored on-premises from Power BI. It acts as the bridge between the cloud-based Power BI service and the on-premises data sources, enabling users to access and analyze that data from within Power BI without moving it to the cloud. Personal gateway: a standalone gateway that can be installed on a personal device, such as a laptop or desktop computer. It allows users to access and refresh data from on-premises data sources, such as databases and file systems, from their personal devices, and it is particularly useful for users who want to refresh their data when they are not connected to the corporate network.
There are a few limits to the personal gateway that users should be aware of. Refresh frequency: the personal gateway has a lower refresh frequency than the on-premises data gateway; its maximum refresh frequency is every 8 hours, while the on-premises data gateway can be configured to refresh data more frequently, up to once per hour. Data volume: the personal gateway can handle up to 10 GB of data, while the on-premises data gateway can handle much larger volumes. Concurrent refreshes: the personal gateway is limited to two concurrent data refreshes at a time, while the on-premises data gateway can handle more. Data sources: the personal gateway supports only a subset of the data sources supported by the on-premises data gateway.
Overall, the personal gateway for Power BI is a useful tool for individual users or small organizations who want to access and refresh data from on-premises sources when they are not connected to the corporate network. However, it has a lower refresh frequency, lower data volume limits, and fewer supported data sources than the on-premises data gateway.

  • Why Choose Power BI

    Microsoft is the leader in data and analytics, and this is reflected in Gartner's Magic Quadrant. Microsoft has momentum in the data analytics space because of its offering in Office 365. Many of my clients ask me to create a cost comparison between Tableau and Power BI only to find out they have already purchased Power BI in their Office 365 E5 subscription; this means setup is only a few clicks away. Power BI offers data preparation, visual-based data discovery, interactive dashboards, and augmented analytics. Power BI Strengths From Gartner's Report: Alignment with Office 365, Teams and Azure Synapse: Power BI's inclusion in the Office 365 E5 SKU has given it enormous reach across organizations. The ability to surface Power BI, and now Goals, within the same Teams interface is a key draw for business users, and the alignment between Power BI and Azure Synapse delivers the capabilities enterprises have been seeking. Price/value combination: Power BI does not sacrifice quality for cost. The Power BI cloud service has a myriad of facets that may be just right for your business. Power portfolio and product ambition: Microsoft knew what it was doing with Power BI, Power Apps, and Power Automate. Power Apps can be embedded in Power BI dashboards or access Power BI datasets, and Power Automate flows can be constructed to take different actions based on data. AI-powered services, for example text, sentiment and image analysis, are available within Power BI Premium. Get the full report from Microsoft: https://powerbi.microsoft.com/en-us/blog/microsoft-named-a-leader-in-the-2022-gartner-magic-quadrant-for-analytics-and-bi-platforms/ Power BI vs Tableau: Power BI and Tableau are both popular business analytics tools that can be used to create and share data visualizations, dashboards, and reports. Both tools have their own strengths and weaknesses, and which one is "better" will depend on the specific needs and preferences of the user. That being said, here are a few reasons why some users might prefer Power BI over Tableau: Ease of use: Power BI is generally considered to be easier to use than Tableau, especially for users who are new to data visualization and business analytics. Power BI has a more intuitive interface and a lower learning curve, making it easier for users to get started and create their first visualizations and reports. Integration with Microsoft products: Power BI is a Microsoft product, and it integrates seamlessly with other Microsoft products, such as Excel and SharePoint. This can be a major advantage for organizations that use a lot of Microsoft products and want to use a business analytics tool that is well-integrated with their existing infrastructure. Cost: Power BI is generally less expensive than Tableau, especially for smaller organizations or individuals who only need a basic set of features. Power BI offers a free version with limited capabilities, as well as paid versions with more advanced features. Again, it's worth noting that both Power BI and Tableau have their own strengths and weaknesses, and which one is "better" will depend on the specific needs and preferences of the user. It's a good idea to try out both tools and see which one works best for you. Power BI vs Looker: Power BI and Looker are both business analytics tools that can be used to create and share data visualizations, dashboards, and reports. Both tools have their own strengths and weaknesses, and which one is more suitable will depend on the specific needs and preferences of the user.
Here are a few advantages of using Power BI compared to Looker: Ease of use: Power BI is generally considered to be easier to use than Looker, especially for users who are new to data visualization and business analytics. Power BI has a more intuitive interface and a lower learning curve, making it easier for users to get started and create their first visualizations and reports. Integration with Microsoft products: Power BI is a Microsoft product, and it integrates seamlessly with other Microsoft products, such as Excel and SharePoint. This can be a major advantage for organizations that use a lot of Microsoft products and want to use a business analytics tool that is well-integrated with their existing infrastructure. Cost: Power BI is generally less expensive than Looker, especially for smaller organizations or individuals who only need a basic set of features. Power BI offers a free version with limited capabilities, as well as paid versions with more advanced features. Here are a few advantages of using Looker compared to Power BI: Data modeling: Looker is known for its powerful data modeling capabilities, which allow users to create complex data models and perform advanced analysis. SQL support: Looker is built on top of a SQL database, and it provides a wide range of tools and features for working with SQL data. This can be a major advantage for users who are comfortable with SQL and want a tool that is optimized for working with SQL data. Customizability: Looker allows users to customize their dashboards and reports using LookML, a proprietary language that is specifically designed for data modeling and visualization. Power BI Vs Qlik Sense Power BI and Qlik Sense are both business analytics tools that can be used to create and share data visualizations, dashboards, and reports. Both tools have their own strengths and weaknesses, and which one is more suitable will depend on the specific needs and preferences of the user. Here are a few advantages of using Power BI compared to Qlik Sense: Ease of use: Power BI is generally considered to be easier to use than Qlik Sense, especially for users who are new to data visualization and business analytics. Power BI has a more intuitive interface and a lower learning curve, making it easier for users to get started and create their first visualizations and reports. Integration with Microsoft products: Power BI is a Microsoft product, and it integrates seamlessly with other Microsoft products, such as Excel and SharePoint. This can be a major advantage for organizations that use a lot of Microsoft products and want to use a business analytics tool that is well-integrated with their existing infrastructure. Cost: Power BI is generally less expensive than Qlik Sense, especially for smaller organizations or individuals who only need a basic set of features. Power BI offers a free version with limited capabilities, as well as paid versions with more advanced features. Here are a few advantages of using Qlik Sense compared to Power BI: Data discovery: Qlik Sense is known for its powerful data discovery features, which allow users to quickly and easily explore and analyze data from multiple sources. Qlik Sense provides a range of visualization options, as well as advanced filtering and drill-down capabilities, to help users uncover insights and trends in their data. Data integration: Qlik Sense provides a range of tools and features for integrating data from multiple sources, including databases, flat files, and cloud-based data sources. 
Qlik Sense allows users to easily combine data from different sources, clean and transform the data, and create a unified data model for analysis. Collaboration and sharing: Qlik Sense provides a range of tools and features for collaboration and sharing, including the ability to publish dashboards and reports to the cloud, share them with other users, and collaborate on them in real time. Qlik Sense also provides a range of options for scheduling and automating the distribution of dashboards and reports. Customization: Qlik Sense allows users to customize their dashboards and reports using Qlik Sense Extensions, a JavaScript-based framework for building custom visualizations and functionality. This allows users to extend the platform beyond its built-in capabilities.

  • What are Slowly Changing Dimensions (SCD)?

    A slowly changing dimension is a type of dimension in a data warehouse that changes over time, but not rapidly. For example, a customer dimension might include attributes such as name, address, and phone number, which may change slowly over time. In a data warehouse, slowly changing dimensions are used to track changes to the attributes of a dimension over time so that historical data can be analyzed and compared to more recent data. There are several different types of slowly changing dimensions, including Type 1, Type 2, and Type 3, which represent different strategies for handling changes to dimension attributes. Type 1 Slowly Changing Dimensions: Type 1 slowly changing dimensions overwrite the previous data when an attribute changes. This means that if an attribute of a dimension changes, the new value replaces the old value, and any historical data is lost. This is the most straightforward approach to handling changes to dimension attributes, but it does not allow for tracking the changes over time. Type 2 Slowly Changing Dimensions: Type 2 slowly changing dimensions create a new record when an attribute changes. If an attribute of a dimension changes, a new record is created to reflect the new value, and the old record is preserved. This allows changes to dimension attributes to be tracked over time. Still, it can result in many records for a single entity, which can make it more difficult to analyze the data. Type 3 Slowly Changing Dimensions: Type 3 slowly changing dimensions add a column to the existing record to store the previous value. When an attribute of a dimension changes, the new value is written to the attribute column and the old value is moved into the "previous value" column, so the record is updated in place rather than duplicated. This allows a limited history of changes to be tracked (typically only the most recent prior value) and is a more efficient approach than Type 2 slowly changing dimensions, as it does not result in multiple records for a single entity. However, it does require adding new columns to the table, which can make the data model more complex.
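
A concrete way to see the Type 2 approach is a small T-SQL sketch that expires the current row and inserts a new version whenever a tracked attribute changes. The table and column names below (dbo.DimCustomer, stg.Customer, CustomerID, Name, Address, EffectiveDate, ExpirationDate, IsCurrent) are hypothetical and only illustrate the pattern; a real dimension load would also manage surrogate keys and wrap both steps in a transaction.

-- Step 1: expire the current row for customers whose tracked attribute has changed.
UPDATE d
SET    d.ExpirationDate = GETDATE(),
       d.IsCurrent      = 0
FROM   dbo.DimCustomer AS d
JOIN   stg.Customer    AS s
  ON   s.CustomerID = d.CustomerID
WHERE  d.IsCurrent = 1
  AND  s.Address <> d.Address;

-- Step 2: insert a new current row for any customer with no current row left
-- (either brand new, or just expired in Step 1).
INSERT INTO dbo.DimCustomer (CustomerID, Name, Address, EffectiveDate, ExpirationDate, IsCurrent)
SELECT s.CustomerID, s.Name, s.Address, GETDATE(), NULL, 1
FROM   stg.Customer AS s
LEFT JOIN dbo.DimCustomer AS d
  ON   d.CustomerID = s.CustomerID
 AND   d.IsCurrent = 1
WHERE  d.CustomerID IS NULL;

Reporting queries then filter on IsCurrent = 1 for the latest view, or join on the EffectiveDate / ExpirationDate range to reconstruct history as of any point in time.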

  • Why I Started My Own Company (Hint: It Was Not For The Salary, Hours, Or Health Plan)

    I formed Bennyhoff Products And Services LLC because I was laid off 6 times in 10 years and I wanted more stability and control in my professional life and brand. In my early career, starting in 2000, I thought employment at large companies would insulate me from swings in the market and provide a steady path for growth. In 2000, I was a new father and simply wanted fair wages for fair work. One of my managers at Sysco Foods, and still my friend to this day, said, “this business is recession and depression proof,” and he was right. However, the business was not immune to changes in the technical landscape. The outsourcing fad soon engulfed the company and my job class was eliminated. My manager called me at 11:00 PM to tell me my position had been eliminated. This was a surreal moment for me, as I was standing in line in an LA emergency room. My child had started vomiting and would not stop, so we sought emergency services. Bad timing does not begin to describe the situation. Jim, if you are reading this, you were the best manager I ever worked for, despite the late-night call! In another instance, in 2009, I was told the company was reorganizing but that I would be allowed to interview for the same position I had been doing for the past 3 years. In that particular instance, all of the people in the room looked around and collectively asked, “did he just say what I thought he said?” It was a made-for-TV moment. These two instances make for funny memories now but were crushing when they occurred. The four other times I was laid off were handled professionally with no drama, but the effect was still the same: revise the resume, re-apply, and reassure my family that everything was going to be OK. The last time I was laid off, I decided to change the pattern and ensure that if one employer decided not to use my services, that would be OK because I would have three more to make up the difference. The only way to do this was to form a company, and that company was Bennyhoff Products And Services LLC. The idea to start and own a company had been maturing in my head for about 10 years, but I never had the guts to pull it off until I was laid off again in 2013. The time was right, and I wanted more. I wanted to secure steady employment for my family and not have to worry about keeping my resume up to date. I wanted to do business on my terms. I wanted to control how my brand was viewed. I wanted to enjoy the people I work with, and I did not feel that leaving any of this to a standard employer-employee relationship was good enough. I formed BPS with the following thoughts. • I do quality work and work with only the best clients. Most of my customers have become my friends; I even married one! Let's start a partnership where I can help your organization deliver on your IT / data goals. • I believe in honesty and delivering value to your company. If I cannot do it, I will tell you and recommend someone else. If you do not like what I have delivered, you do not pay for it; no paperwork, no runaround. • I believe in making your job easier; that is why you hired a consultant, right? This means doing the work your way, using the proper processes and procedures. Mike Bennyhoff

  • Long Live The DBA

    I think the DBA role will be more critical in the coming years. I have bet my business and career on this belief. While databases are becoming more reliable and require less technical maintenance, they are becoming more complex and offer more capabilities. When SQL Server Reporting Services was first released, few products used it; now, companies are dumping Crystal Reports and using SSRS. The Forrester Wave showed that in 2015 MSFT owned about 50% of the BI/reporting market. Most of this was taken from SAP Crystal Reports, and yes, that now somehow falls to the DBA. Most recently, MSFT has integrated R into SQL Server. I think this will follow a similar pattern: IBM SPSS dominates now, but MSFT will gain traction in this area, and again this domain will fall to the modern DBA to know and understand. You could reasonably but incorrectly argue that it was always the responsibility of the DBA to know these technologies, but from my long career and my many opportunities to review job postings (see my job layoffs on my website), I started noticing DBA job listings with reporting / SSRS in about 2005. Most recently, I have seen many employers forgo even listing the technologies that are needed and instead use the cryptic shorthand of SSAS / SSIS / SSRS, Tabular, and data visualization. The role of the DBA has increased, not decreased, in terms of the technology required to perform the function successfully. Attaining a competitive advantage with complex reporting, statistical, and relational technology will never fall to your ordinary business analyst or mediocre bureaucrat. The DBA role has always had these types of responsibilities. In the early days, it was keeping the database and LOBA (line of business application) running. This involved antiquated system pagers and someone who was dedicated to answering them. Today the need for a pure technical person to keep the system running is less intense, but other business concentrations are needed. I talk about this in my post A Growing Share Of A Shrinking Market. I do assert that the decreased need for pure tech to keep the system running is offset by the organizational risk of advocating for the right set of technologies and the increased need for business skills. In the early 90s you could make a career out of knowing perfmon, system RAM, and CPU. Today you need to know all of that plus how to move it to VMware or the cloud, along with SSAS / SSRS / SSIS. Today you need to know how to query the data and then how and why you should surface the data. The simple ability to paint a report is not good enough; DBAs also need to know why the numbers are important to the business. If we also stipulate that there are other platforms vying for a seat at the DBA/BI/reporting/relational table, the number of ways an executive can make a mistake quickly multiplies. In short, the DBA with business skills will be golden for quite some time. Final Thoughts Overall, the DBA position has grown and changed. DBAs should acquire a balance between tech, business, and analysis. I believe the DBA will continue to drive business savings and strategic advantage for many years. Mike Bennyhoff

  • Everything You Always Wanted To Know About TDE But Were Afraid To Ask

    Transparent Data Encryption (TDE) encrypts the data within the physical files of the database. Without the original encryption certificate and master key, the data cannot be read when the drive is accessed or the physical media is stolen; this is what we call encryption for data at rest. TDE was introduced in SQL Server 2008 and later refined in SQL Server 2012; when a database is encrypted, the data files, log files, and backup files are encrypted. Once TDE is enabled for a database, the SQL instance encrypts data before it is written to disk and decrypts the data when it is read from disk. The best part of this feature is, as its name implies, it is completely transparent to your application. This means no application code changes are required (only administrative changes to enable it for a database), and hence there is no impact on the application code or functionality of any application that references the database. What's Great About TDE • TDE is completely transparent to the application; no changes are required in your line of business application. • TDE is enabled on the database data files and log files, and subsequently the backup files. This means backups cannot be restored to another system unless that system also has the certificate and key used to encrypt them. • TDE also makes changes at the instance level, namely to tempdb, which is encrypted as well. Downsides To TDE • Yes, while it sounds great to encrypt all of your data with no changes to your line of business application, you will pay a performance penalty of roughly 3% to 5% of your CPU. • You will need copies of the SQL Server certificate and private key on all SQL instances where you want to move or restore data from production.
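
For reference, here is a minimal sketch of the T-SQL steps typically used to enable TDE. The database name (MyDatabase), certificate name (TDECert), file paths, and passwords are placeholders; back up the certificate and private key to a safe location, because they are required to restore encrypted backups on any other instance.

USE master;
GO
-- 1. Create a database master key in master (skip if one already exists).
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<UseAStrongPasswordHere>';
GO
-- 2. Create the certificate that protects the database encryption key, and back it up immediately.
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO
BACKUP CERTIFICATE TDECert
    TO FILE = 'C:\Backup\TDECert.cer'
    WITH PRIVATE KEY (FILE = 'C:\Backup\TDECert.pvk',
                      ENCRYPTION BY PASSWORD = '<AnotherStrongPassword>');
GO
-- 3. Create the database encryption key in the target database and turn encryption on.
USE MyDatabase;
GO
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
GO
ALTER DATABASE MyDatabase SET ENCRYPTION ON;
GO
-- Optional: check progress; encryption_state = 3 means the database is fully encrypted.
SELECT DB_NAME(database_id) AS database_name, encryption_state
FROM   sys.dm_database_encryption_keys;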

  • A Growing Share Of A Shrinking Market

    One of the areas where I have found success as a contract DBA is the mundane work of backups, tuning, and installation. Most of my customers have automated backup and performance monitoring solutions that allow you to manage the "herd" of SQL instances rather than individual servers. The organizations that have not automated have moved to the cloud (Azure). Automation and cloud have reduced most of the responsibilities of the traditional DBA, who focused on backups, keeping the server running, and installing new software / patching systems. I have seen many organizations rush to the cloud and lay off the people who perform these tasks, and that is good for me! Most organizations move to the cloud or automate and then are forced to justify the expense of this change by cutting too many people. The story that I usually hear, when I make a sales call, is: we moved to the cloud or automated, laid off people, and now have an issue. We no longer have any in-house DBAs or anyone who understands the data architecture of the system, thus we need Bennyhoff Products And Services. Roughly 30% of my business is acquiring a growing share of a shrinking market: old 90s-style DBA skills. This is great for me at the present time, but I consider what the IT landscape will look like in the future. I will have a solid market for basic DBA services for the next 5 or so years, but I am anticipating a time when there is simply no market for this type of service. My plan to evolve with the market is already underway. The remaining roughly 70% of the business falls into two other categories: • Custom database-driven applications • Reporting, Business Intelligence, Big Data. Custom database-driven applications hosted on my hardware: Azure is great, but it is also expensive for small applications that have fewer than 10 users, and I can beat Azure hosting prices with my custom hardware. I can give customers a shit-ton of RAM and CPU so their application outperforms Azure and performs the specified function quite nicely. Considering that 90% of the server and infrastructure costs are sunk costs (costs I have to pay anyway), this works very well for me. Target market: smaller organizations that need custom reporting, database design, or automation, where the user load is small. I charge for the application development and then later for the hosting. I have vertically integrated so I can offer the best service for a competitive price. Reporting, business intelligence, big data: I have never met an executive who did not need to see a report changed, modified, or updated in some way. I sell the ultimate group of services for reporting. I have an MBA, so I understand how the business works and why a CEO/CFO might want to see a report constructed in a specific way. I have the technical skills to create a solution, either on the customer's hardware or mine.

  • Are Headhunters Evil?

    On my first foray into consulting, I was recruited by an agency... a headhunter. They had contracted with a customer and wanted to hire me as a Production Database Administrator for SQL Server. The recruiter asked two or three questions about my skills and wanted to know if I would interview that day. I said yes! The customer called and interviewed me and gave me the job on the spot. I was interviewed on a Friday and started that Monday. Sounds great, right? Some of the advantages of using a staffing agency / recruiter / headhunter: Headhunters have access to a wider range of job opportunities: Headhunters often have access to job openings that are not advertised to the general public. Headhunters can save you time: Searching for a job can be a time-consuming process. A headhunter can do the legwork for you, identifying job opportunities that match your skills and experience and presenting them to you. Headhunters can provide valuable career advice: Headhunters have extensive experience in the job market and can provide valuable advice on how to improve your resume, negotiate salary, and prepare for job interviews. Headhunters can negotiate on your behalf: A headhunter can help you negotiate salary and other terms of employment, potentially securing a better offer for you. Headhunters can provide insight into company culture: Headhunters often have inside information on the culture and working environment of the companies they represent, which can be helpful in determining whether a particular job is a good fit for you. Some of the disadvantages of using a staffing agency / recruiter / headhunter: Fees: Headhunters typically charge a fee to their clients, which may be a percentage of your first-year salary or a flat fee. This can be a significant cost, especially if you are not successful in securing a job through the headhunter. Limited control: When you work with a headhunter, you are relinquishing some control over the job search process. The headhunter will present job opportunities to you, but you may not have the opportunity to actively search for and apply to jobs on your own. Limited options: A headhunter may not present you with every available job opportunity that matches your skills and experience. They may only present you with job openings from companies they are currently working with. Conflicts of interest: A headhunter's primary goal is to place candidates with their clients and earn their fee. This means that their interests may not always align with your own, and they may not be as focused on finding the best possible job for you. Competition: Headhunters often work with a large number of clients, which means that you may be competing with other job seekers for the same opportunities.

  • SQL Server Statistics

    In SQL Server, statistics are used by the query optimizer to determine the most efficient way to execute a query. Statistics are created on columns in a table or indexed view and include information about the distribution of the data in the column, such as the minimum, maximum, and average values. The query optimizer uses this information to estimate the number of rows that will be returned by a query and to select the most efficient execution plan. To check if the statistics are current, you can use the DBCC SHOW_STATISTICS command. This command displays the header, histogram, and density information for the statistics of a table or indexed view. The header information includes the name of the statistics, the date and time when the statistics were last updated, and the number of rows in the table or indexed view. The histogram information includes a list of steps, or ranges, that represent the distribution of the data in the column. The density information is a measure of the uniqueness of the data in the column. To ensure that the statistics are current, you can use the UPDATE STATISTICS command to update the statistics on a table or indexed view. You can also set the AUTO_UPDATE_STATISTICS option to ON to have SQL Server automatically update the statistics when it determines that they are out of date. It's generally a good idea to keep the statistics up to date in order to ensure that the query optimizer has accurate information about the data and can generate the most efficient execution plans. One benefit of having AUTO_UPDATE_STATISTICS set to ON is that it can help to improve query performance by ensuring that the optimizer has accurate statistics. Another benefit is that it can help to prevent the use of outdated statistics, which can lead to poor query performance. However, there are also some potential drawbacks to having AUTO_UPDATE_STATISTICS set to ON. One potential drawback is that updating statistics can be resource-intensive, particularly on large tables. This can lead to increased CPU and I/O usage, which can impact the performance of other queries that are running on the server. In addition, updating statistics too frequently can also be resource-intensive, so it's important to consider the trade-off between the benefits of having up-to-date statistics and the potential performance impact of updating them. The UPDATE STATISTICS command is used to update the statistics on a table or indexed view in SQL Server. The basic syntax of the command has not changed significantly across different versions of SQL Server, but there have been some changes to the options that are available and the default behavior of the command. In SQL Server 2005 and later, the UPDATE STATISTICS command includes an option to specify the SAMPLE size, which determines the percentage of rows used to update the statistics. By default, SQL Server determines the sample size automatically based on the size of the table, but you can specify a different percentage if needed. In SQL Server 2008 and later, the UPDATE STATISTICS command includes an option to specify the RESAMPLE option, which causes the statistics to be updated using a new sample of the data. This can be useful if the data distribution has changed significantly since the last time the statistics were updated. In SQL Server 2012 and later, the UPDATE STATISTICS command includes an option to specify the FULLSCAN option, which causes the statistics to be updated using a full scan of the data.
This can be useful if the data distribution is not representative of the entire table and a full scan will provide more accurate statistics. In SQL Server 2016 and later, the UPDATE STATISTICS command includes an option to specify the INCREMENTAL option, which causes the statistics to be updated using only the data that has been added or modified since the last time the statistics were updated. This can be more efficient than updating the statistics using a full scan or sample of the data, particularly for large tables. In summary, the UPDATE STATISTICS command has remained relatively consistent across different versions of SQL Server, but there have been some changes to the options that are available and the default behavior of the command. How often should I update statistics? There is no one-size-fits-all answer to this question, as the frequency with which statistics should be updated depends on various factors such as the size of the table, the volume of data changes, and the performance requirements of the system. In general, it's a good idea to update statistics periodically to ensure that the query optimizer has accurate information about the data distribution and can generate efficient execution plans. However, updating statistics can be resource-intensive, particularly on large tables, so it's important to balance the need for up-to-date statistics with the impact on performance. There are a few approaches you can take to determine the optimal frequency for updating statistics: Monitor query performance: If you notice that query performance is degrading over time, it may be a good indication that the statistics are out of date and need to be updated. You can use the DBCC SHOW_STATISTICS command to check the header information and see when the statistics were last updated. Monitor data changes: If the data in the table is frequently changing, it may be a good idea to update the statistics more frequently to ensure that the optimizer has accurate information. You can use the sys.dm_db_stats_properties DMV to check the modification_counter column, which indicates the number of data changes since the last time the statistics were updated. Test different update frequencies: You can experiment with different update frequencies to determine the optimal frequency for your specific environment. You can use the sys.dm_db_stats_properties DMV to check the rows column, which indicates the number of rows in the table, and the modification_counter column, which indicates the number of data changes since the last time the statistics were updated. This can help you determine the rate at which the statistics become outdated and how frequently you should update them to maintain good performance. It's important to note that the optimal frequency for updating statistics may vary depending on your specific environment and workload, so it's a good idea to monitor query performance and data changes to determine the best frequency for your system. T-SQL that updates statistics on every table in every user database using the UPDATE STATISTICS command is sketched at the end of this post. Summary And Recommendations From Mike: Do not set the AUTO_UPDATE_STATISTICS option to ON UNLESS you have a working SQL Agent job that is managing your statistics. Update statistics daily, unless you have a good reason not to. Cheers, Mike B
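
Since the script itself did not make it into this excerpt, here is a minimal sketch of one way to run UPDATE STATISTICS against every user table in every online user database using dynamic SQL. Treat it as an illustration under those assumptions rather than a production-ready maintenance job; a scheduled SQL Agent job or an established maintenance framework would normally add logging and options such as FULLSCAN or SAMPLE.

-- Loop over every online user database and rebuild statistics on each user table.
DECLARE @db  sysname,
        @sql nvarchar(max);

DECLARE db_cursor CURSOR FOR
    SELECT name
    FROM   sys.databases
    WHERE  database_id > 4              -- skip system databases
      AND  state_desc = 'ONLINE';

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one UPDATE STATISTICS statement per table in the current database.
    SET @sql = N'USE ' + QUOTENAME(@db) + N';
        DECLARE @stmt nvarchar(max) = N'''';
        SELECT @stmt += N''UPDATE STATISTICS ''
                      + QUOTENAME(SCHEMA_NAME(schema_id)) + N''.''
                      + QUOTENAME(name) + N'';'' + CHAR(13)
        FROM   sys.tables;
        EXEC sp_executesql @stmt;';

    EXEC sp_executesql @sql;
    FETCH NEXT FROM db_cursor INTO @db;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;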

  • A Few Things To Ask During An Initial Recruiter Call

    For anyone who has known me longer than 5 minutes, you know that I love being an independent contractor. I provide database and analysis services for technologies such as SQL Server, MSFT Azure, and Crystal Reports. However, many new independent contractors fall into trouble when dealing with recruiters. I write this post so you can avoid some of the mistakes that I have made over my 20-year career. Things to ask during an initial recruiter call: Who is the client? Recruiters are generally cagey about giving up this information. They are rightfully worried that you will take this information and bid the work with someone else. However, this question gives you insight into their relationship with the end client. If they have a strong personal relationship or a contractual agreement with the end customer, they are likely to give this out. If they have no relationship and they are bidding from a public request for services, they are not likely to give this information out. What is your relationship with the client? This question gives you a sense of where this person is in the hierarchy of the recruitment firm. For example, if they have no relationship with the client and say their supervisor deals directly with the client, they are likely an underling who is grinding away for the company rainmaker, the person you will need to speak with to close the deal. What are the qualifications the client is looking for? The recruiter has a list of the skills that the perfect candidate has. Remember, recruiters shop for talent based on their client's request, not necessarily charisma. Recruiters are looking for someone who has X number of years with skill A, B, and C, whereas a regular employer looks for a mix of skills and a good fit with their organization. Best of luck, Mike Bennyhoff, CEO, Bennyhoff Products And Services

  • BI Trends 2022-23

    Most organizations are still adapting to the changed business requirements brought about by the pandemic. Even though the situation currently seems less severe and there are long-term changes towards a new normal, day-to-day business isn’t settled yet. Some organizations are grappling with a decline in orders, while others are struggling with supply chain disturbances or are still adapting their operations to the changed requirements. A recent study established that organizations are still working on their data foundations and are working to position themselves for the long term. Businesses are addressing the primary causes of their challenges and are working towards establishing a holistic data-driven culture. Companies are watching emerging technology to discover the trends that will give them an edge over their competitors. This article covers five emerging trends in BI that are driving its implementation. What Are The Top 5 Emerging Trends In BI? Whether you’re planning to install business intelligence tools or already have, understanding emerging trends in BI is essential. Monitoring trends can assist you in making the most out of them and navigating BI digitalization’s disruptive forces. Here are five emerging trends that are shaping the BI industry’s future: 1. Automated Machine Learning Also referred to as AutoML, automated machine learning is a tool that allows business analysts who don’t have a strong machine learning background to build machine learning models to solve business problems. This technology gives business analysts robust ML and AI problem-solving features without requiring deep ML experience. More businesses are embracing AutoML tools as they can handle the heavy information lifting required to get to the core of performance. Most of these companies have integrated AutoML into Power BI and Microsoft Azure, enabling them to use these advanced tools. ML algorithms can help identify factors limiting your brand’s health, and AutoML identifies the underlying currents limiting growth. In an era of increasing automation, it only makes sense for a business to leverage the advantages of AutoML. Partnering with a BI service provider like Bennyhoff Products and Services (BPS) can help you integrate AutoML software into your business intelligence tools to help you remain competitive. The software will help you make sense of your company data and transform how BI is shared across your departments, thus optimizing data-driven decision-making. Advanced analytic techniques have risen in popularity when developing business solutions; with AutoML, businesses can make the most of these capabilities with little ML experience. 2. Embedded BI Applications Embedded BI refers to integrating data visualizations, dashboards, and reports inside an application. A BI platform usually manages and displays the data placed directly in the app’s user interface to enhance decision-making and data usability. The embedded industry sector is experiencing substantial demand from large, medium, and small enterprises primarily due to data analysis, reporting, management, and visualization offerings. The increased mobile BI adoption with cloud computing tech has also contributed to the embedded sector’s growth. The emergence of data-driven businesses has also contributed to the increased use of embedded applications. As a business, you must create a collaborative outside-in approach to innovation by opening your analytics to your customers, partners, and the broader ecosystem.
You should embed analytics by inserting dashboards into a workflow, along with alerts tied to micro insights that can lead to enhanced decision-making. Embedded BI applications provide businesses with a modern way to present data, ultimately boosting satisfaction and user engagement. 3. Data Security Data security refers to protecting company data and preventing data loss via unauthorized access. This includes protecting data from attacks that can destroy or encrypt data or that can corrupt or modify data. Data security also entails ensuring that information is available to authorized individuals. Data security is a primary concern for most companies in the digital era, as many cyber attackers are looking for an opportunity to strike. There’s always a chance of a successful cyber attack if a company hasn’t implemented effective security measures. Since consumers know the value of their personal data, businesses must work to close every security loophole. As a business, the more you share APIs and data and embed trigger actions and analytics, the more you need to protect against failures. BPS can help you use Power BI methods like sensitivity labels and row-level security. Row-level security restricts which rows of data a given user or group can see, while sensitivity labels let a report owner apply a label that defines how sensitive a report is. When implemented well, solid data security approaches will protect a company’s data against cyberattacks as well as the other threats and human errors that are the main causes of data breaches. 4. Data Discovery/Visualization Data visualization refers to data representation through graphics like animations, infographics, plots, and charts. In contrast, data discovery refers to locating and identifying regulated or sensitive data to protect or remove it securely. Business agility is the hallmark of a successful enterprise, and data discovery is part of its foundation. Data discovery gives companies an overview of their operations so they can understand and address any challenges they face. Data discovery has risen in popularity as companies have started treating data as an asset, and the data they collect from their operations and customers has the potential to give them a competitive edge. Data discovery allows companies to turn BI into a competitive advantage, whether in efficiency gains, customer experience, or product innovation. As a business, you should leverage data discovery to identify and solve business challenges. By assessing data, you can find patterns and trends that can assist you in improving your operations, services, and products. You can also use insights like market trends and customer behavior to make better decisions. Data discovery can help identify correlations, outliers, and trends and create reports and visualizations that companies can use to communicate findings to stakeholders. 5. SaaS and Cloud Analytics The most basic cloud computing model, SaaS, is hosted, web-based, or on-demand software. A vendor supplies the same software as-is to various businesses and is responsible for segmenting users’ data and for upgrading and maintaining the service. Cloud analytics is a delivery and service model for hosting that deals with the computation or analysis of business information using cloud technologies. Most businesses opt for SaaS and cloud-based BI technologies as they offer the potential for reduced costs, increased flexibility, and faster deployments compared to conventional on-premise BI software.
To increase your competitive edge, consider working with BPS to implement SaaS and cloud-based BI for timely and accurate forecasting and real-time data assessment. Cloud-based solutions are more sophisticated in handling security and enable safe information transfer from various sources without compromising data security. What Is The Future Of BI? The BI sector has come a long way since its inception in the 19th century; today, data-driven decisions are made using BI technology rather than hunches. Businesses use BI to understand customer behavior and enhance their bottom line. The next generation of the BI industry promises to be conversational, customizable, approachable, and accessible. BI’s future is proactive: companies will have data before they ask for it, revealing insights they never knew they needed. Future BI will empower everyone in the company to understand and harness the power of information to make business decisions ethically and intelligently. To confront the ever-changing BI landscape, businesses should develop systems that help them adapt to these changes. Contact BPS for tools like Power BI to increase efficiency and enable everyone in your company to make data-driven decisions.

  • Data Modeling With Navicat

    Modeling the data shows the relationships between tables. Bennyhoff Products and Services frequently uses Navicat to model data warehouses and complex reports. The product costs either $22 per month on a subscription basis or $367 as a one-time purchase. There is also a free Express version available; it is limited, but it allows clients to view and approve my work. I usually design a model for a client and send them the Navicat file and a link to the free version. From the Navicat website: Navicat Data Modeler is a powerful and cost-effective database design tool which helps you build high-quality conceptual, logical and physical data models. It allows you to visually design database structures, perform reverse/forward engineering processes, import models from ODBC data sources, generate complex SQL/DDL, print models to files, and much more. Simplify the task of creating complex entity relationship models and generate the script SQL with a simple click. Navicat Data Modeler supports various database systems, including MySQL, MariaDB, Oracle, SQL Server, PostgreSQL, and SQLite. https://www.navicat.com/en/products/navicat-data-modeler

