SharePoint Search 2013 - is there any way to index a list of URLs stored in a database?


I have a database table with a list of URLs that I would like SharePoint Search 2013 to index so they show up in search results. The URLs point to a mixture of content types - web pages, Word documents, PDFs, etc.

All the URLs are internal to my network, but they aren't SharePoint pages or files stored in SharePoint.

I am using SharePoint 2013 Enterprise Search on a Windows Server 2008 R2 server.

Does anyone have any ideas on how to achieve this?

I have searched for options but can't seem to find anything relevant - BDC and BCS come up a lot, but they seem to be more about indexing the content returned by the connector itself. What I want is to use the data returned from the table as pointers to the items to be indexed.

I'm very new to SharePoint and SharePoint Search, and I'm at a bit of a loss as to how to go about this. (To make it even more difficult, I would also like to apply ACLs to the results, and the ACLs are in another table - but that's another question!) Given my experience level, I would like the answer to be as basic as possible, but any help would be appreciated.

There is 1 answer below.

Answer by Dan Gøran Lunde:

BDC and BCS are the proper way to do it, but they're quite complicated. If you want something simple, create a small script that writes all the URLs into a single HTML document, then use the web crawler to crawl that document. The crawler will follow the links and index the content they point to.
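A minimal sketch of that script, assuming the URLs live in a table named `Urls` with a single `Url` column (the table and column names are placeholders; SQLite is used here as a stand-in for whatever database actually holds the list):

```python
import sqlite3
from html import escape

def export_urls_to_html(conn, out_path):
    """Write every URL in the Urls table as a link in one HTML page."""
    rows = conn.execute("SELECT Url FROM Urls").fetchall()
    links = "\n".join(
        f'<li><a href="{escape(url, quote=True)}">{escape(url)}</a></li>'
        for (url,) in rows
    )
    page = f"<html><body><ul>\n{links}\n</ul></body></html>"
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(page)
    return len(rows)

# Demo with an in-memory database standing in for the real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Urls (Url TEXT)")
conn.executemany(
    "INSERT INTO Urls VALUES (?)",
    [("http://intranet/page1.aspx",), ("http://fileserver/doc.pdf",)],
)
count = export_urls_to_html(conn, "all-urls.html")
print(count)  # number of links written
```

Publish the generated page somewhere the crawl account can reach over HTTP, then add that page's URL as the start address of a "Web Sites" content source in the Search Service Application, and re-run the script (e.g. on a schedule) whenever the table changes.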