To help with the BCS example, I first create a SQL view named SearchView using the following SQL.
CREATE VIEW dbo.SearchView AS
SELECT
    p.ProductKey,
    p.ProductName,
    p.ProductDescription,
    p.Manufacturer,
    p.BrandName,
    p.ClassName,
    p.ColorName,
    p.UnitCost,
    p.UnitPrice,
    psc.ProductSubcategoryName,
    pc.ProductCategoryDescription
FROM Contoso_Retail_DW.dbo.DimProduct p
LEFT OUTER JOIN Contoso_Retail_DW.dbo.DimProductSubcategory psc
    ON p.ProductSubcategoryKey = psc.ProductSubcategoryKey
LEFT OUTER JOIN Contoso_Retail_DW.dbo.DimProductCategory pc
    ON psc.ProductCategoryKey = pc.ProductCategoryKey
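Before wiring the view up to BCS, it's worth a quick sanity check that it returns rows. A minimal sketch, assuming the view was created in the Contoso_Retail_DW database:

```sql
-- Quick smoke test of the SearchView view: pull a handful of rows
-- and confirm the subcategory/category joins resolved as expected.
SELECT TOP 10
    ProductKey,
    ProductName,
    ProductSubcategoryName,
    ProductCategoryDescription
FROM dbo.SearchView
ORDER BY ProductKey;
```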
Next we fire up SharePoint Designer to create the External Content Type:
1. Click the External Content Type ribbon action.
2. Click the "Click here to discover external data sources and define operations" link.
3. Select SQL Server as the data source.
4. Enter the connection info.
5. Right-click SearchView and select New Read Item Operation.
6. Enter the details about the operation, click Next, then Finish.
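Conceptually, the Read Item operation fetches a single row from the view by its identifier. In SQL terms it corresponds roughly to the following sketch (the @ProductKey parameter name is illustrative; BCS generates its own query):

```sql
-- Rough SQL equivalent of the BCS Read Item operation:
-- return one product row from the view by its key.
SELECT
    ProductKey,
    ProductName,
    ProductDescription,
    Manufacturer,
    BrandName,
    ClassName,
    ColorName,
    UnitCost,
    UnitPrice,
    ProductSubcategoryName,
    ProductCategoryDescription
FROM dbo.SearchView
WHERE ProductKey = @ProductKey;
```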
Now create a New Read List Operation:
1. Enter the operation details.
2. Click Next, and note the warning regarding specifying a limit filter.
3. Click Finish.
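The limit filter the wizard warns about caps how many rows the Read List operation returns, which matters once BCS starts enumerating large tables. In SQL terms it amounts to something like this sketch (the @Limit parameter name is illustrative):

```sql
-- Rough SQL equivalent of the BCS Read List operation with a limit filter:
-- return at most @Limit product rows from the view.
SELECT TOP (@Limit)
    ProductKey,
    ProductName
FROM dbo.SearchView
ORDER BY ProductKey;
```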
Now click Summary View in the ribbon, select ProductName in the Fields list, and click the Set As Title ribbon action.
Then save your external content type.
Now if we browse to our site collection, there's a Contoso Product list and we can see the BCS in action!
The BCS option has a nice feature that the JDBC connector didn't have - Profile Pages.
You can create a site to host profile pages, and then in your Business Connectivity Services Service Application you can create profile pages for your external content type.
If I browse to the profile page, I get this error.
That's a good error :). Because I haven't specified a product ID as a query string parameter, the error tells me it can't find the item, which suggests it's at least querying the DB.
Now navigate to your FAST Content Search Service Application.
Create a new content source, specify Line of Business Data as the content source type, and select Contoso_Retail_DW.
Now start a crawl! While a crawl is running you can check whether it's working in the crawl log.
If a crawl fails, the crawl log usually gives a useful message.
I also suggest you fire up ULSViewer to see what the cause is. ULSViewer is THE BEST TOOL for troubleshooting BCS and FAST Crawling using BCS.
Now that it looks like our crawl has successfully finished, let's try a test search.
I get two results, the first is from the BCS and the second item is from the JDBC connector. I wonder why the teaser description is different between the two? The BCS also has a different icon due to search resolving the JDBC item target as a folder.
Thanks to profile pages, I can actually click the BCS result which takes me to the profile page for the product.
This is one area which would require a bit of extra work and development if you go with the JDBC connector approach.
Finally, we want to map some search properties. The managed properties from the previous demo are still set up, so I'll map the new BCS crawled properties to those same managed properties.
Now, when you are in the FAST Query Search Service Application, don't click the Metadata Properties link in the left menu: 'these aren't the search properties you're looking for' /waves hand.
If you mess with the property mappings in there you won't see your BCS properties. Instead, click the FAST Search Administration link.
Then click on the Managed Properties link.
Click on Crawled property categories
You should notice the Business Data category with a large number of properties; click the Business Data link.
There's a list of our crawl properties!
You can map crawled properties to managed properties from here too.
After mapping all the crawled properties to managed properties, perform a full crawl.
Now retest the same search.
Note that the refiner counts have all incremented compared to the previous test search, so the property mappings and refiners are now set up. It's as easy as that!
Thanks for reading! I've got one more post to go in this series, where I'll take a closer look at the pros and cons of the JDBC Connector and the BCS Connector, and why you might choose either depending on your requirements and priorities.
3 comments:
Do you know if it's possible to follow the same approach if you're dealing with a MySQL DB?
Looking forward to Part 4!! Great post!
The one requirement I have not been able to solve with FAST ESP and BCS is security trimming. Can security trimming be done through BCS by surfacing ACLs at crawl time?
Or does this require writing a custom connector? If so, do you have any resources or pointers for this?