Category Archives: Dicas
On November 29th and 30th we will have another edition of 24 Hours of PASS – Portuguese Edition. The event is already well known in the Microsoft Data Platform technical community.
As everyone knows, the event is 100% free and online, so there are no excuses not to attend. In this edition I will have the pleasure of presenting a session about Power BI Report Server, talking a bit more about this Power BI deployment model and how your organization can adopt it as well. The details of my session follow below. Read the rest of this entry
When you have a consolidated environment, you usually don’t want to grant users very broad permissions; most of the time you only want them to see what is really necessary.
Doing that in the Database Engine is quite easy, but what about Reporting Services? We know that SSRS exposes the following default roles: Browser, Content Manager, My Reports, Publisher, and Report Builder. However, in my experience those roles sometimes grant the user too much.
What people usually don’t know is that you can connect to your SSRS instance using SQL Server Management Studio and then create customized roles. So, let me show you how to do it. Read the rest of this entry
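Before creating a custom role, it can help to see what is already defined. As a sketch, assuming you can query the SSRS catalog database directly (the default database name is ReportServer, and the dbo.Roles table layout below is from a default install and may vary between versions), something like this lists the existing roles:

```sql
-- Inspect the roles currently defined in the SSRS catalog.
-- Run against the report server catalog database (default name: ReportServer).
SELECT RoleName,
       Description
FROM   dbo.Roles
ORDER  BY RoleName;
```

The catalog tables should be treated as read-only; the custom roles themselves should be created through SSMS connected to the SSRS instance, since editing the catalog directly is unsupported.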
I hope you are all doing well and running everything in the Cloud!
In these times full of change, the event is going through a major transformation in order to give the whole Microsoft community a new, up-to-date experience: welcome to MICROSOFT 365 DAY.
For this event we have gathered the best professionals from the technical community, and now, in addition to Office 365, we will also have EM+S sessions covering Azure AD, ATA, Cloud App Security, AIP, and Intune, and, to close the cycle, sessions about Windows 10.
MICROSOFT 365 DAY is innovative in every way: the event will be 100% online, with 24 sessions of 25 minutes each and 4 business sessions of 50 minutes, all divided into three knowledge tracks:
First of all, I want to thank everyone who attended my session at SQL Saturday 689 in Prague this last weekend. It was a tremendous pleasure to present and share a bit of what I know with all of you.
I also want to say THANK YOU to the Czech BI & SQL Pass User Group for putting on this amazing event for the community. I know how much work and effort it takes to run such a great event. You treated not only me but all the speakers so well, so thanks again for having me! You guys are great!
I also want to say THANK YOU to the sponsors; with all your support, the event was just awesome!
As I promised during my presentation, here is the post with the link where you can download most of the resources I used during the session.
I would also like to ask for a bit of your time to fill out the session evaluation. This means a lot to all the speakers; it helps us improve our presentations, subjects, demos, and so on. Please, please, please! Do it! The link is below.
Thanks again and see you in 2018!
Data Platform MVP
Some time ago I was recording a course about creating baselines with SQL Server, Integration Services, and Reporting Services. It turns out that with my change of routine, of job, and a series of other factors, recording the course ended up taking a back seat.
After thinking more about it, I decided not to continue recording the videos, and so I released on my YouTube channel the first ones I had already recorded. There are about 4 or 5 short videos, but they contain some really nice tips. Read the rest of this entry
PS: Portuguese version below!
Before even starting this blog post, I want to make it very clear: if you are a Python/developer expert, most likely this isn’t for you.
The reason is that if you are that kind of person, you will probably find the courses I am sharing here too basic, because that is really the main idea. I want to share the courses that I did, or that I am still doing and reviewing from time to time, in order not to forget what I learnt.
This blog post is meant for people like me who are SQL Server professionals and want to learn more about this language and what can be done with it. So, without prolonging this too much, here is my list. Read the rest of this entry
I’m not sure if you guys know it, but on October 7th, 2017 we will have another amazing SQL Saturday 656, taking place in Copenhagen, Denmark.
Surely it will be an amazing day of learning and networking with the Data Platform experts who are coming from all over. So, if you are an IT person and are not quite sure yet what to do on Oct 7th, join this FREE event. You can get all the details from the home page.
Two things that are important to mention! Read the rest of this entry
I just wanted to write this quick blog post to give you a tip about a recent problem I had in an SSIS project. My scenario is the following:
Source Database Server: PostgreSQL
Target Database Server: SQL Server 2016
As you can see, I need to extract data from this PostgreSQL database and import it into SQL Server. So far everything had been working, but today I got this error on one particular table: [OLE DB Destination] Error: An error occurred while setting up a binding for the “MyColumnName” column. The binding status was “DBBINDSTATUS_UNSUPPORTEDCONVERSION”. The data flow column type is “DT_NTEXT”. The conversion from the OLE DB type of “DBTYPE_IUNKNOWN” to the destination column type of “DBTYPE_WVARCHAR” might not be supported by this provider. Read the rest of this entry
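The full fix is behind the link, but a common workaround for this kind of binding error (a sketch, assuming the offending column is a PostgreSQL text column; my_table and my_column_name are placeholder names) is to cast the column to a bounded varchar in the source query, so the provider exposes it as a plain string instead of a stream type that maps to DT_NTEXT:

```sql
-- PostgreSQL query used in the SSIS source component.
-- Casting text to a bounded varchar avoids the DBTYPE_IUNKNOWN binding.
SELECT id,
       CAST(my_column_name AS varchar(4000)) AS my_column_name
FROM   my_table;
```

Pick a length at least as large as the longest value in the column; alternatively, a Data Conversion transformation inside the data flow can perform the same cast.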
I wanted to share something I am doing this week at work. One of the applications I support is generating a lot of transaction log in the database. Just to give you an idea, we run transaction log backups every 30 minutes, and I have backups over 100 GB and sometimes 200 GB. The difficult part of investigating what is generating that amount of log is that the databases involved are used by at least 3 different applications and streams. So, I remembered that in the past I created a simple script that looks at this information in the database, which gives me more input on what is really generating that amount of log. Read the rest of this entry
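The original script is behind the link above; as a hedged sketch of the general DMV-based approach (not necessarily the author’s exact script), the query below shows how much log each currently open transaction has generated and which session and application own it:

```sql
-- Log generated per active transaction, with the owning session/application.
-- A sketch of the general approach using the transaction DMVs.
SELECT s.session_id,
       s.login_name,
       s.program_name,
       DB_NAME(dt.database_id)                             AS database_name,
       dt.database_transaction_log_bytes_used / 1048576.0  AS log_mb_used,
       dt.database_transaction_log_record_count            AS log_records
FROM   sys.dm_tran_database_transactions AS dt
JOIN   sys.dm_tran_session_transactions  AS st
       ON st.transaction_id = dt.transaction_id
JOIN   sys.dm_exec_sessions              AS s
       ON s.session_id = st.session_id
ORDER  BY dt.database_transaction_log_bytes_used DESC;
```

Since the DMVs only cover transactions that are open right now, sampling this query on a schedule (e.g. via a SQL Agent job into a logging table) is what makes it useful for finding which application drives the log growth over time.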