SQL Server 2016 RTM

SQL Server 2016 has reached RTM, bringing some major enhancements and a convergence of features between Azure and on-premises. The ‘R’ capabilities and seamless integration with big data are game-changers.



Posted in Uncategorized | Leave a comment

When two-factor authentication really isn’t

My hope is that by now all companies are using multifactor authentication (MFA for short) to protect all critical assets. MFA simply means that a user must validate their identity through at least two independent mechanisms. Two-factor authentication, or 2FA for short, just means that two independent mechanisms are utilized. However, many companies implementing 2FA are not implementing true 2FA, because the mechanisms are not totally independent. Unless there is no way for one mechanism in a 2FA scheme to be leveraged to gain access to the other, the mechanisms cannot be considered independent.

Two common 2FA schemes that are subject to compromise because of a lack of independence are the use of a dial-back number that goes to a smartphone and hosting the second factor on a corporate laptop. This post focuses on the risk of a smartphone being on the receiving end of the dial-back number.

Here is a typical sequence of events for 2FA involving a dial-back number:

1) User initiates a logon to a web site, remote desktop, or some other resource

2) User is prompted for credentials

3) User enters credentials

4) Upon successful entry of the credentials, the system prompts for an access code that is communicated on the dial-back number.

5) User then receives a phone call or, in the case of a cell phone, perhaps a text message; alternatively, an app may be configured to acknowledge the call.

6) At this point, the authentication generally occurs in one of two ways:

a. User enters the code communicated via text message or voice call into the primary authentication interface. This is commonly used for web site scenarios; for example, my GoDaddy account is set up in that manner.

b. User answers the phone or runs an app, which causes a notification to be sent automatically back to the application. This scenario is commonly utilized for remote desktop access, such as the solution provided by Duo Security.

So, what can go wrong with the dial-back scenario when it involves a smartphone? A lot:

Suppose the phone doesn’t have auto-lock enabled, or has a long interval before the lock kicks in (e.g., one minute), and also hosts email. Consider what happens if a person steals the cell phone.

1) The user of the phone may not immediately realize the cell phone is gone or may be slow to report it.

2) The thief is able to access the phone before the auto-lock kicks in.

3) Even if the auto-lock does kick in, the notification default on the iPhone, for example, seems to show notifications even in lock mode. In that case, the thief can actually see the access code. Worse yet, if the site does not utilize a security question, the thief can do a “forgot password” and then receive an email with a new password or a link to reset the password, which also displays on the lock screen.

The bottom line for 2FA with a smartphone is that the user needs to configure the phone to be secure, particularly if the user’s email is on the phone. In reality, a smartphone is not a true secondary authentication method for 2FA if it has email on it, unless it is very tightly secured through a biometric method such as a thumbprint along with a very short lockout period. A land line or a dumb cell phone provides a much higher level of security in that there is no email or web access the thief can leverage to determine the password for the logon authentication. Organizations that rely on a dial-back number may wish to rethink allowing a cell phone to be used for 2FA, or at least utilize group policy to lock down the phone and never send passwords over email for a forgotten password. Another approach is to utilize a completely separate device, such as a token-oriented or smart-card security device, that does not have other application capabilities.

In my next post, I will describe the risks of 2FA associated with a corporate laptop, where the token is sent to an application installed for a VPN connection. This is potentially almost as bad as relying on a cell phone.

Note – If you do not have multifactor authentication implemented, or if you have the type of risk discussed in this article, please contact us at sales@authintel.com and we can help you get robust multifactor authentication implemented quickly and inexpensively. Controlling any access with just a username and password, or any other single mechanism, is a huge risk.

Posted in 2FA, data security, Multifactor Authentication, Security | Tagged | Leave a comment

Multiple divergences for S&P analyzed (Corrected)

Note: My first version of this post had an error in the query, which has been corrected; with the correction, the difference between up and down days is not significant. However, the larger the VIX change, the more likely the next day is to be a VIX up day and an S&P down day.

Today was an interesting day in the stock market. The S&P 500 closed positive while the indicators most correlated with it, aside from other equity indexes, closed strongly in the bearish direction: Treasuries were up and high yield/corporate bonds were down. These are all the opposite of the norm, at least for the last several years. Along with that, volatility was down significantly, which does correlate with upward moves. Since this seems to make the move in the S&P suspect, at least for the immediate term, I ran a query in my equity history database. I used CUBE with the GROUPING function to get a rollup of each grouping as well as of the entire selection.

Below is the query:
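The query was originally posted as an image. As a stand-in, here is a minimal sketch of the kind of CUBE/GROUPING rollup described, using the dbo.EquityHistory table from the self-join post further down; the symbol names, the next-day joins, and the omission of the Treasury and high-yield conditions are assumptions for illustration only.

-- Sketch only: divergence day = S&P up while the VIX is down (the Treasury and
-- high-yield conditions are omitted for brevity), grouped by the next day's VIX
-- direction, with a grand-total row from CUBE.
select GROUPING(d.NextDayVix) as IsTotalRow,
       d.NextDayVix,
       COUNT(*) as Instances,
       AVG(d.NextDaySpChgPct) as AvgSpChgNextDay
from
(
    select case when vnext.ChgPct > 0 then 'U' else 'D' end as NextDayVix,
           snext.ChgPct as NextDaySpChgPct
    from dbo.EquityHistory sp
    inner join dbo.EquityHistory vix
        on vix.DayNumber = sp.DayNumber
        and vix.TradingSymbol = '^vix'
    inner join dbo.EquityHistory snext
        on snext.TradingSymbol = sp.TradingSymbol
        and snext.DayNumber = sp.DayNumber + 1
    inner join dbo.EquityHistory vnext
        on vnext.TradingSymbol = vix.TradingSymbol
        and vnext.DayNumber = vix.DayNumber + 1
    where sp.TradingSymbol = '^gspc'
    and sp.ChgPct > 0     -- S&P up on the divergence day
    and vix.ChgPct < 0    -- VIX down on the divergence day
) d
group by cube (d.NextDayVix);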


Although I have history for the S&P back to 1950, some of the other instruments only go back to 1990, so the analysis is limited to that point. 195 instances met the condition over a 25-year period, so this has only happened on average about 8 times per year.

Results for Days where VIX is down while High Yield is down, Treasuries are up, but S&P 500 is up:


Out of 195 instances, the VIX was up the next day slightly less often than it was down (93 versus 102 instances). Along with that, the average change on the days when the VIX went up was negative and much larger in magnitude than on the VIX down days. The up days occur most often during tumultuous markets such as 2000 and 2008. Below are the detailed results.

Note that if I decrease the change percent to -7%, which represents the percent change in the VIX today, the likelihood of a down day is higher (15 to 12), but with such a small number of instances, it is hard to make a case for trading based on that.

More interesting would be to look at technical indicators such as MACD, RSI, etc. associated with the instruments and examine how these interact over longer periods in this type of configuration, but that takes more complex queries.

The usual disclaimer applies: i.e., I am not a professional investment advisor, short-selling and options involve significant risks including complete and total financial ruin, talk to your personal financial counselor, etc., etc., etc.


Posted in SQL Tips and Techniques, Stock Market | Tagged , , | Leave a comment

History may not repeat, but…

Somebody from a private forum I belong to known as “T-Theory” posted a graph that shows the rationale for being long the S&P when the 10-week exponential moving average is trending above the 50-week exponential moving average and being short in the opposite case. T-Theory is an approach invented by the late Terry Laundry for viewing market behavior in terms of cycles: markets tend to spend half of their time rising at a faster pace than during the other half, and these periods tend to occur symmetrically. For a more detailed explanation of T-Theory, see http://cdn3.traderslaboratory.com/forums/attachments/34/31965d1350013176-beyond-taylor-a1997introttheory_.pdf

The graph below shows the exponential moving averages (EMA) for 10 weeks (red line) and for 50 weeks (green line). A sell signal is generated when the red line crosses below the green line, while a buy signal occurs when the red line crosses back above the green line. A sell signal was recently generated for the S&P 500 and also exists for the other major indexes, including the Dow. A sell signal has also been in place for many months for many foreign indexes, including the Chinese market.


Clearly, this has been a good strategy since 2000. I decided to quantify the benefits since then, as well as over the longer term, using my equities database. I recently added .NET SQLCLR functionality (a mechanism in Microsoft SQL Server that allows one to write .NET code and integrate it into database functions) to my SQL Server database, which makes it relatively easy to calculate different technical indicators on the fly. I have been able to build a library of technical functions that are available from within the database. It is easy to find samples of C# code for calculating technical indicators such as the EMA that can then just be plugged into the CLR. I will write a more detailed post that includes samples of the CLR code and the implementation process if that is of interest.
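For anyone curious what the SQL side of that looks like, below is a hedged sketch of registering a CLR-based EMA function. The assembly path, assembly name, class name, and function signature are assumptions for illustration, not my actual library.

-- Sketch only: register a .NET assembly and expose one of its methods as a
-- table-valued function that returns an EMA series for a symbol and period.
create assembly TechnicalIndicators
from 'C:\CLR\TechnicalIndicators.dll'   -- hypothetical path
with permission_set = SAFE;
go

create function dbo.fn_Ema (@TradingSymbol nvarchar(20), @Period int)
returns table (MarketDate date, Ema float)
as external name TechnicalIndicators.[Indicators.Ema].GetEma;
go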

Below is the query to calculate the average return per week while being on the “right” side of the trade shown in the picture. I used a temporary table because, for some reason, executing the CLR function inline and joining is slower than storing the results in a temp table and then joining.
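Since the original query shows only as a screenshot, here is a minimal sketch of its general shape, reusing the hypothetical dbo.fn_Ema from the sketch above and an assumed dbo.WeeklyHistory table of weekly changes; none of these names come from the actual schema.

-- Sketch only: stage the 10- and 50-week EMAs in a temp table, then average
-- the weekly change by which side of the crossover the index was on.
select e10.MarketDate, e10.Ema as Ema10, e50.Ema as Ema50
into #Ema
from dbo.fn_Ema('^gspc', 10) e10
inner join dbo.fn_Ema('^gspc', 50) e50
    on e50.MarketDate = e10.MarketDate;

select case when e.Ema10 > e.Ema50 then 'U' else 'D' end as CrossOver,
       AVG(w.ChgPct) as AvgWeeklyChg
from dbo.WeeklyHistory w
inner join #Ema e
    on e.MarketDate = w.MarketDate
where w.TradingSymbol = '^gspc'
and w.MarketDate >= '2000-01-01'
group by case when e.Ema10 > e.Ema50 then 'U' else 'D' end;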


CrossOver AvgWeeklyChg (S&P 500 index since 2000)
D -0.177947
U 0.206229

The results show a significant benefit since 2000: avoiding an average weekly loss of almost 0.2% during downtrends and achieving an average gain of over 0.2% per week during uptrends.

But what about the longer term? For this, I regenerated the temporary table using the Dow from 1928. This also shows a definitive benefit to being on the right side of the trade. In fact, the average weekly gain on the upward crossover is roughly 8 times the magnitude of the average weekly loss on the downward crossover, which comes in at -0.026%. While -0.026% seems small, it is significant considering the Dow itself went from 240 to over 17,500 (it has already lost over 500 points since its EMA crossover trend change). And this is only the weekly loss, not the cumulative loss from when the trend changes. I will quantify the average total change between crossover changes in a later post, but just looking at the crashes of 1929, 2000, 2008, etc., this has historically often amounted to over 50%.

CrossOver  AvgWeeklyChg (Dow since 1928)
D -0.026041%
U 0.197229%

Is past history guaranteed to repeat? No, but when something has worked for almost 90 years, it gives one pause.

Even though history does not always repeat in the same fashion, it does rhyme; that is, cycles tend to repeat. The conditions associated with 2001 and 2008 are different from today’s but unfortunately bear enough similarities (i.e., high valuations like 2000-2001, high debt and slow growth like 2007-2008) to make a case that a negative cycle in equities is unfolding again. I’ve already documented this in other postings related not only to long-term technical indicators but also to fundamental indicators such as the ratio of GDP to market cap, Shiller’s 10-year Cyclically Adjusted Price Earnings (CAPE), wealth growth versus growth of stock prices, etc.

Psychologically, it is often difficult to change one’s mindset until it is too late. I’m afraid many will jump into equities more and more aggressively on the dips rather than take risk off the table at high points. The type of trend that has started could easily lead to further drops of over 50% from current values based on a reversion to the mean. At some point, capitulation occurs, where people sell out of the market at the point where it has actually bottomed and don’t resume buying until the indexes have once again risen too much.

A common answer to this conundrum is that it is impossible to time the market, so one may as well just buy and hold since it always comes back. But how long can that take? The stock market peak of 1929 took nearly 30 years to recover, and Japan is still only at about 50% of its 1989 peak. Would a large decline in US equities take that long to recover from? That seems unlikely based on history, but the possibility that recovery to the recent highs may take a decade or more has been proposed by some market strategists who have predicted prior bear markets, such as John Hussman (http://www.hussmanfunds.com/wmc/wmc150921.htm).

Executing the EMA crossover strategy does mean that one would not sell at the best possible point or buy at the lowest point; it is not a perfect strategy, but the results are far better than simply buying and holding. When making investment decisions, it seems wiser to me to follow a model that has historically proven most likely to succeed over the long term than to plow into a market that, from a long-term technical perspective, is now likely in decline and just hope for the best.

Posted in Stock Market, Technical Analysis | Tagged , | 2 Comments

A couple of factoids on data security to think about

Here’s a factoid that most people, even many at the IT management level, don’t realize:

A 128 GB thumb drive, which can be had for under $40, can store enough information to enable identity theft for the population of the entire world (7 billion people). 128 GB is approximately 137 billion bytes, which works out to about 19 bytes per person. Name and address data can normally be compressed by a factor of 3, and birth dates plus social security numbers or other national identifiers take only 6 bytes in packed format. So, figuring about 40 bytes for name and address compressed to 13 bytes, the total comes to 19 bytes per person.

Another factoid: it takes less than 90 seconds to download 10 million records containing a person’s name, address, spouse, birth date, and social security number. This is based on a cable modem connection of 50 Mb/s, which equates to about 6 MB/s, or 360 MB/minute. A person’s complete identity record is normally less than 50 bytes, so the entire data set for 10 million people is only about 500 MB (50 bytes * 10,000,000) uncompressed. With compression, it takes less than 30 seconds.
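As a quick back-of-the-envelope check of both factoids (pure arithmetic, no real data involved):

select 128.0 * 1024 * 1024 * 1024 / 7000000000 as BytesPerPerson,   -- roughly 19.6
       50.0 * 10000000 / 1000000 as UncompressedMB,                 -- 500 MB
       50.0 * 10000000 / (6.0 * 1000000) as SecondsAt6MBps;         -- roughly 83 seconds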

Unfortunately, my experience has been that the government and most companies are not making much of a priority of protecting data. They throw money at sophisticated products that do not actually address the problem. They pay for expensive audits from companies that do not actually have the technical expertise to spot real vulnerabilities. Regulatory audits seldom find actual problems because they are focused on outdated security mechanisms that have little applicability to the most common scenario, whereby data is taken from inside the network to the outside, rather than somebody breaking through a firewall from the outside.

In some cases, the approaches taken actually make it easier to steal data. For example, data thieves love encryption, which only protects against physical media being stolen and does nothing to protect data once it arrives to the user decrypted. Encryption allows thieves to encrypt the data they steal so that systems have no way of knowing what is leaving the network. Even worse, companies rarely audit access to sensitive data and have no idea that there is a breach until all of the data is exposed. Many IT departments fail to implement simple controls for locking down files stored on the network, ensuring point-to-point security for service accounts, and so on. We are not just talking about PII (Personally Identifiable Information) or Protected Health Information (PHI), which is bad enough; now large chunks of intellectual property (IP) are being stolen as well. The number of instances of data theft will only continue to multiply as long as organizations do not try to solve the root problems.

Our company is focused on data security from the inside out. We will come onsite for a day for free and call out vulnerabilities. We can provide a rapid assessment over a couple of weeks that generates a score sheet identifying specific vulnerabilities and remedies. We have a complete tool set to automate identification and resolution of security issues at the database, application, and file system levels where data theft originates. We are especially focused on healthcare, with experience in HIPAA regulations. Most healthcare providers are not actually meeting HIPAA requirements. One of the requirements is that a record of access covering every individual who has looked at a person’s healthcare data can be produced on demand.
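To make that requirement concrete at the database level, here is a minimal sketch of an access-audit table and the on-demand disclosure query. The table name, columns, and sample values are assumptions for illustration only, not a prescribed schema.

-- Sketch only: hypothetical access-audit table and record-of-access report.
create table dbo.PhiAccessAudit
(
    AuditId     bigint identity(1,1) primary key,
    PatientId   int           not null,
    AccessedBy  nvarchar(128) not null,   -- login or application user
    AccessedAt  datetime2     not null default sysutcdatetime(),
    SourceApp   nvarchar(128) null,
    ActionTaken nvarchar(50)  not null    -- e.g. 'ViewRecord', 'PrintRecord'
);

declare @PatientId int = 12345;           -- patient requesting the report

select AccessedBy, AccessedAt, SourceApp, ActionTaken
from dbo.PhiAccessAudit
where PatientId = @PatientId
order by AccessedAt;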

We are experts at analyzing for vulnerabilities at the database and file system level, where data theft originates. By the time the data goes out through the firewall, it is already encrypted and undetectable by firewalls. The only way to stop data theft is to implement safeguards at the data and application level. This requires a unique combination of data security, database, and application development skills. We are experts at working with huge amounts of data: one of the products we developed for financial risk management has a database of over 3 billion records and supports near-instantaneous queries of complex information requests.

I am one of fewer than 150 certified Microsoft SQL Server Masters in the world and one of probably fewer than 15 or 20 who also hold a top-tier ISC2 CISSP certification. My recent PhD is in the area of automated learning, whereby problems can be modeled, simulated, and used to learn heuristics for solving them. I have over 30 years of experience in application development, including 10 years working with classified systems. My network of resources includes the top people and companies in the world with expertise in machine learning related to data security, as well as all aspects of data security including at the network level.

Do you want to do something proactive to stop data theft and have truly end-to-end security implemented to prevent inside-out theft, or wait until after a breach occurs? Do you have a way to detect that a breach has even happened if the person uses trusted credentials to carry it out? Most data theft is carried out by an unauthorized person misusing authorized credentials. Do your systems really detect this situation? This can only be done by implementing controls at the database and application levels.

Contact us at sales@authintel.com if you really want to practice due diligence to prevent and stop data theft. Give us an opportunity to help you before it is too late. We host a large, secure co-located environment that can provide a sandbox area where we can stage your entire IT infrastructure as virtual machines. Through the use of over 20 Fusion-io high-speed SSD drives, we can provision virtual machines in seconds. We have an automated data obfuscation tool, including verification, that allows you to create a realistic testing environment without risk of theft of meaningful data. Using our sandbox also helps evaluate your level of data preparedness and disaster recovery capability.

Why wait until after a breach occurs to take action? Does your company really want the liability of not only having its data stolen, but also of failing to meet regulatory requirements such as those mandated by HIPAA?

Posted in Uncategorized | Leave a comment

High-performance Statistical Queries using Self-joins

In my pursuit of understanding asset markets, I’ve maintained a SQL Server database with a lot of information about the stock market and other indexes. Using some data services along with SSIS, this database has been kept current to the point that it now has over 3 billion records in total, including 2.3 billion records in an intraday table. Most of my queries and interest concern the cumulative equity and index end-of-day history, which is only 75 million rows, and the options data since 2003, which is now up to 175 million rows.

To be able to query this level of data, I utilize Fusion-io PCIe SSD storage for the SQL Server database. Using self-joins can produce some very interesting analysis. For example, the query below outlines the performance of a few global indexes after large bounces that occurred close to market tops. The query completes in just a few seconds. A couple of tricks make it run faster. One is storing a relative day number, which avoids the performance issues of working around weekends when querying prior-dated history; the day number is sequential across holidays and weekends, so a direct link can be made without a range test (a sketch of one way to maintain such a number follows below). The other is that the table is partitioned based on the date, which allows a good deal of parallelism.
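Here is a minimal sketch of how such a day number could be maintained; whether the numbering should run per symbol or be shared across all symbols is an assumption on my part.

-- Sketch only: assign a sequential trading-day number per symbol so that the
-- previous trading day is always DayNumber - 1, with no range test needed.
with Numbered as
(
    select DayNumber,
           ROW_NUMBER() over (partition by TradingSymbol order by MarketDate) as SeqDayNumber
    from dbo.EquityHistory
)
update Numbered
set DayNumber = SeqDayNumber;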

Here is the query:

-- For each index, find days where the two-day change exceeds 6 (ChgPct appears
-- to be expressed in percent) while the prior all-time closing high is no more
-- than 15% above the current close, and show the forward performance.
select h.TradingSymbol, h.MarketDate, h.ChgPct + hprev.ChgPct as TwoDayChg, hprev.PriceAtClose as Price,
(h20.PriceAtClose - h.PriceAtClose) / h.PriceAtClose as Chg1Month,
(h60.PriceAtClose - h.PriceAtClose) / h.PriceAtClose as Chg3Month,
(h180.PriceAtClose - h.PriceAtClose) / h.PriceAtClose as Chg9Month,
(h360.PriceAtClose - h.PriceAtClose) / h.PriceAtClose as Chg18Month,
(h540.PriceAtClose - h.PriceAtClose) / h.PriceAtClose as Chg27Month,

-- Distance of the pre-bounce close from the highest prior close
((select MAX(hpast.PriceAtClose) from dbo.EquityHistory hpast
where hpast.TradingSymbol = h.TradingSymbol
and hpast.MarketDate < hprev.MarketDate) - hprev.PriceAtClose) / hprev.PriceAtClose
as PctFromTop

from dbo.EquityHistory h
-- Prior trading day: DayNumber is sequential across trading days, so no range test is needed
inner join dbo.EquityHistory hprev
    on hprev.DayNumber = h.DayNumber - 1
    and hprev.TradingSymbol = h.TradingSymbol
-- Forward rows roughly 1, 3, 9, 18, and 27 months out (about 20 trading days per month)
inner join dbo.EquityHistory h20
    on h20.TradingSymbol = hprev.TradingSymbol
    and h20.DayNumber = hprev.DayNumber + 20
inner join dbo.EquityHistory h60
    on h60.TradingSymbol = hprev.TradingSymbol
    and h60.DayNumber = hprev.DayNumber + 60
inner join dbo.EquityHistory h180
    on h180.TradingSymbol = hprev.TradingSymbol
    and h180.DayNumber = hprev.DayNumber + 180
inner join dbo.EquityHistory h360
    on h360.TradingSymbol = hprev.TradingSymbol
    and h360.DayNumber = hprev.DayNumber + 360
inner join dbo.EquityHistory h540
    on h540.TradingSymbol = hprev.TradingSymbol
    and h540.DayNumber = hprev.DayNumber + 540
where h.TradingSymbol in ('^dji','^dax','^ixic','^n225','^ftse','^djt','^gspc','^rut','^ssec')
and h.ChgPct + hprev.ChgPct > 6.0
-- Only bounces where the prior all-time high close is within 15% above the current close
and (select MAX(hpast.PriceAtClose) from dbo.EquityHistory hpast
where hpast.TradingSymbol = h.TradingSymbol
and hpast.MarketDate < hprev.MarketDate) between 1.0 * h.PriceAtClose and 1.15 * h.PriceAtClose
order by h.MarketDate

And here are the results formatted in Excel:

This is not a good omen for the stock market for the next couple of years based on history.

There is probably not enough data here to draw firm conclusions, but since the query is against indexes rather than individual stocks, it does seem fairly convincing. It is fairly certain that a query against the entire equity history would yield similar bottom-line averages, but that would take several minutes to complete.



Posted in Uncategorized | Leave a comment

Increased data security focus for Authintel

We’ve all seen the news about the latest data security breaches. While bureaucrats blame these on sophisticated hacks from China, the reality is that most are due to negligence and are so simple that a child with basic computer knowledge could pull many of them off. http://www.darkreading.com/attacks-breaches/the-eight-most-common-causes-of-data-breaches/d/d-id/1139795?

The problem is that technology has focused on encryption and firewalls while neglecting security at the basic data and application tiers, with very little concept of proactively monitoring actual user behaviors. Most data theft occurs through compromised employees or stolen credentials, wherein the perpetrator appears to the system as a trusted user and is not monitored. Our company holds credentials that include a PhD in automated learning, the highest-level ISC2 security certification (CISSP), SQL Server Master, and certified .NET application developers. We are uniquely qualified to resolve the use cases that lead to security breaches at the application and data levels. We have produced a video that shows how millions of PII records can be stolen without a trace in less than 5 minutes, using an ordinary user account, at most companies. We are focused on resolving the actual use cases that lead to data theft rather than on elaborate technologies that are difficult to configure and mostly ineffective. Contact us and we can perform an audit as well as provide remediation, including deployment of automated scripts and tools.

Posted in Uncategorized | Leave a comment