First -- I do not believe this is a duplicate issue. I've searched extensively for the same or similar problems on SO, and given the amount of troubleshooting done before asking, I believe this problem is unique.

Facebook cannot grasp my og:image files and I have tried every usual solution. I'm beginning to think it might have something to do with https://...

  • I have checked http://developers.facebook.com/tools/debug and have zero warnings or errors.
  • It is finding the images we linked to in og:image, but they're showing up blank. When we click the image(s), however, they DO exist and it takes us straight to them.
  • It DOES show one image -- an image hosted on a non-https server.
  • We've tried square images, jpegs, pngs, larger sizes and smaller sizes. We've put the images right in public_html. Zero are showing up.
  • It's not a caching error, because when we add another og:image to the meta tags, FB's linter does find and read it. It DOES show a preview. The preview is blank. The only exception is for images that are not on this website.
  • We thought maybe there was some anti-leech setting in cPanel or the .htaccess that was preventing the images from showing up, so we checked. There was not. We even did a quick <img src="[remote file]"> from an entirely different server and the image shows up fine.
  • We thought maybe it was the og:type or an oddity with another meta tag. We removed them all, one at a time, and re-checked each time. No change; just warnings.
  • The same code on a different website shows up without any issue.
  • We thought maybe it was not pulling images because we're using the same product page(s) for multiple products (changing the content based on the GET value, i.e., "details.php?id=xxx"), but it's still pulling in one image (from a different URL).
  • If we leave og:image and image_src off entirely, FB does not find any images.

I am at the end of my rope. If I told you how much time I and others have spent on this, you'd be shocked. The issue is that this is an online store. We absolutely, positively cannot NOT have images. We have ten or so other sites... This is the only one with og:image problems. It's also the only one on https, so we thought maybe that was the problem, but we can't find any precedent anywhere on the web for that.

These are the meta-tags:

<meta property="og:title" content="[The product name]" /> 
<meta property="og:description" content="[the product description]" /> 
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-art-black.png" />
<meta property="og:image" content="http://www.[ADIFFERENTwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png" />
<meta property="og:image" content="https://www.[ourwebsite].com/images/ARShopHeader.png" />
<meta property="og:image" content="http://www.[ourwebsite].com/overdriven-blues-music-tshirt-art-black.JPG" />
<meta property="og:type" content="product"/>
<meta property="og:url" content="https://www.[ourwebsite].com/apparel-details.php?i=10047" />
<meta property="og:site_name" content="[our site name]" />      
<meta property="fb:admins" content="[FB-USER-ID-NUMBER]"/>
<meta name="title" content="[The product name]" />
<meta name="description" content="[The product description]" />
<link rel="image_src" href="https://www.[ourwebsite].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />
<meta name="keywords" content="[four typical keywords]">
<meta name="robots" content="noarchive">

In case you want it, here's a link to one of our product pages that we've been working on. [Link shortened to try to curb this getting into search results for our site]: http://rockn.ro/114

EDIT ----

Using the "see what facebook sees" scraper tool, we were able to see the following:

"image": [          
      {
         "url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-details-safari.png"
      },
      {
         "url": "https://www.[httpSwebsite].com/images/shirts/soul-man-soul-music-tshirt-art-safari.png"
      },
      {
         "url": "http://www.[theotherNONSECUREwebsite].com/wp-content/uploads/2011/06/ARS-Header-Shine2.png"
      }
   ],

We tested all links it found for a single page. All were perfectly valid images.

EDIT 2 ----

We tried a test and added a subdomain to the NONSECURE website (from which images are actually visible through Facebook). The subdomain was http://img.[nonsecuresite].com. We then put all the images into that subdomain's folder and referenced those. FB would not pull those images in. However, it would still pull any images referenced on the non-secure main domain.

POSTED WORKAROUND ----

Thanks to Keegan, we now know that this is a bug in Facebook. To work around it, we created a subdomain on a different NON-HTTPS website and dumped all the images there. We referenced the corresponding http://img.otherdomain.com/[like-image.jpg] image in og:image on each product page. We then had to go through the FB Linter and run EVERY link to refresh the OG data. This worked, but it is a band-aid workaround, and if the https issue is fixed and we go back to using the natural https domain, FB will have cached the images from a different website, complicating matters. Hopefully this information helps save someone else from losing 32 coding hours of their life.
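
For illustration, each product page ended up with an image tag along these lines (the subdomain and filename here are placeholders standing in for our real ones):

<meta property="og:image" content="http://img.[otherdomain].com/overdriven-blues-music-tshirt-details-black.png" />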


Solution

I ran into the same problem and reported it as a bug on the Facebook developer site. It seems pretty clear that og:image URIs using HTTP work just fine and URIs using HTTPS do not. They have now acknowledged that they are "looking into this."

The bug can be seen here: https://developers.facebook.com/bugs/260628274003812

Other tips

Some properties can have extra metadata attached to them. These are specified in the same way as other metadata, with property and content, but the property name will have an extra : and a suffix.

The og:image property has some optional structured properties:

  • og:image:url - Identical to og:image.
  • og:image:secure_url - An alternate url to use if the webpage requires HTTPS.
  • og:image:type - A MIME type for this image.
  • og:image:width - The number of pixels wide.
  • og:image:height - The number of pixels high.

A full image example:

<meta property="og:image" content="http://example.com/ogp.jpg" />
<meta property="og:image:secure_url" content="https://secure.example.com/ogp.jpg" /> 
<meta property="og:image:type" content="image/jpeg" /> 
<meta property="og:image:width" content="400" /> 
<meta property="og:image:height" content="300" />

So for your HTTPS URLs you need to change the og:image property to og:image:secure_url.

Ex:

HTTPS META TAG FOR IMAGE:

<meta property="og:image:secure_url" content="https://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />

HTTP META TAG FOR IMAGE:

<meta property="og:image" content="http://www.[YOUR SITE].com/images/shirts/overdriven-blues-music-tshirt-details-black.png" />

Source: http://ogp.me/#structured <-- You can visit this site for more information.

Hope this helps you.

EDIT: Don't forget to ping Facebook's servers after updating your code - URL Linter
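
If you would rather trigger the refresh from a script than paste each URL into the Linter by hand, the Graph API also accepts a POST with scrape=true. A minimal sketch in Python, assuming the requests library; the access token and page URL below are placeholders:

import requests

# Placeholders: substitute your own app access token and the page you want re-scraped.
ACCESS_TOKEN = "APP_ID|APP_SECRET"
PAGE_URL = "https://www.example.com/apparel-details.php?i=10047"

# Asking Facebook to re-scrape the page refreshes its cached og:image data.
response = requests.post(
    "https://graph.facebook.com/",
    data={"id": PAGE_URL, "scrape": "true", "access_token": ACCESS_TOKEN},
)
print(response.status_code, response.json())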

I don't know if it's only me, but for me og:image did not work and Facebook picked my site logo instead, even though the Facebook debugger showed the correct image.

But changing og:image to og:image:url worked for me. Hope this helps anybody else facing a similar issue.
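
In other words, only the property name changes (the URL below is a placeholder):

<meta property="og:image:url" content="https://www.example.com/images/your-image.png" />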

Got here from Google, but this wasn't much help for me. It turned out that the logo's aspect ratio was the problem; the limit appears to be 3:1. Mine was almost 4:1. I used Gimp to crop it to exactly 3:1 and voila - my logo is now shown on FB.
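
If you prefer to script the crop instead of opening Gimp, here is a minimal sketch using Pillow; the filenames are placeholders, and it simply center-crops the image to an exact 3:1 ratio:

from PIL import Image

# Placeholders: point these at your own logo files.
src = Image.open("logo.png")
w, h = src.size

# Center-crop to an exact 3:1 aspect ratio.
if w / h > 3:                      # too wide: trim the sides
    new_w = h * 3
    left = (w - new_w) // 2
    box = (left, 0, left + new_w, h)
else:                              # too tall: trim top and bottom
    new_h = w // 3
    top = (h - new_h) // 2
    box = (0, top, w, top + new_h)

src.crop(box).save("logo-3x1.png")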

tl;dr – be patient

I ended up here because I was seeing blank images served from a https site. The problem was quite a different one though:

When content is shared for the first time, the Facebook crawler will scrape and cache the metadata from the URL shared. The crawler has to see an image at least once before it can be rendered. This means that the first person who shares a piece of content won't see a rendered image

[https://developers.facebook.com/docs/sharing/best-practices/#precaching]

While testing, it took Facebook around 10 minutes to finally show the rendered image. So while I was scratching my head and throwing random og tags at Facebook (and suspecting the https problem mentioned here), all I had to do was wait.

As this might really stop people from sharing your links for the first time, FB suggests two ways to circumvent this behavior: a) running the OG Debugger on all your links, so the image is cached and ready for sharing after ~10 minutes, or b) specifying og:image:width and og:image:height. (Read more at the link above.)

Still wondering though what takes them so long ...

I had the same error and none of the previous suggestions helped, so I followed the original Open Graph Protocol documentation, added the prefix attribute to my html tag, and everything became awesome.

<html prefix="og: http://ogp.me/ns#">

I had similar problems. I removed the og:image:secure_url property and now it scrapes fine with just og:image. Sometimes, less is more.

I discovered another scenario that can cause this issue. I went through all the steps described in the question and the answers, still the problem remained.

I checked my images and found that some of my posts had far too large thumbnail images in og:image, in the range of several thousand pixels and several megabytes.

This happened due to a recent migration from WP to Jekyll: I had optimized my images with gulp but used the original images in og:image by mistake.

Facebook gives us the following recommendations as of today:

Use images that are at least 1200 x 630 pixels for the best display on high resolution devices. At the minimum, you should use images that are 600 x 315 pixels to display link page posts with larger images. Images can be up to 8MB in size.

So there is an upper limit of 8MB.

As I accidentally found, the transparent blank image comes with a response header indicating a possible cause of the problem.

  1. Go to the debugger at https://developers.facebook.com/tools/debug/og/object/
  2. Enter your URL
  3. At the bottom, Facebook shows your "image" (a transparent 1x1 GIF)
    1. The image links to your original image, so there's no point clicking it
    2. Right-click and view the image (you'll get something like https://external-ams3-1.xx.fbcdn.net/safe_image.php?d=...&url=...)
  4. Open the Net tab in Firebug/developer tools and refresh the page if needed
  5. You'll see an x-error-detail response header with an explanation

For example, in my case it was Invalid image extension for URL: https://[mydomain]/[myfilename].jpg
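
If you'd rather check from a script than the browser's Net tab, here is a minimal sketch assuming the requests library; paste in the safe_image.php URL that the debugger's blank preview points to:

import requests

# Placeholder: the safe_image.php URL taken from the debugger's blank preview image.
SAFE_IMAGE_URL = "https://external-ams3-1.xx.fbcdn.net/safe_image.php?d=...&url=..."

response = requests.get(SAFE_IMAGE_URL)
# If Facebook served the blank 1x1 GIF, this header explains why (e.g. an invalid image extension).
print(response.headers.get("x-error-detail"))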

The real issue in my case was related to prerender.io.

As it turns out, if the image is requested via prerender, it's converted to HTML, something like this:

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" "http://www.w3.org/TR/REC-html40/loose.dtd">
<html>
<head><meta http-equiv="content-type" content="text/html; charset=utf-8"></head>
<body style="margin: 0px;"><img style="-webkit-user-select: none; cursor: -webkit-zoom-in; " src="https://[yourdomain].com/[yourfilename].jpg" width="1078" height="718"></body>
</html>

It's either a bug in prerender itself, or the proxy is supposed to be configured not to use prerender for *.jpg requests (even when they come from the Facebook bot).

This is really hard to notice, as prerender is only used for certain User-Agent headers.

I ran into the same issue, and then I noticed that I had a different domain for og:url.

Once I made sure that the domain was the same for og:url and og:image it worked.

Hope this helps.

Don't forget to refresh Facebook's servers through:

Facebook Debugger

And click on "Collect new info"

In my case the problem was not providing the CA root certificate. I figured it out after using https://www.ssllabs.com/ssltest/analyze.html to analyze the SSL configuration.

Similar symptoms (Facebook et al not correctly fetching og:image and other assets over https) can occur when the site's https certificate is not fully compliant.

Your site's https cert may seem valid (green key in the browser and all), but it will not scrape correctly if it's missing an intermediate or chain certificate. This can lead to many wasted hours checking and rechecking all the various caches and meta tags.

This might not have been your problem, but it could be for others with similar symptoms (like mine). There are many ways to check your cert; the one I happened to use: https://www.sslshopper.com/ssl-checker.html

In addition, this problem also occurs when you add a user generated story (where you do not use og:image). For example:

POST /me/cookbook:eat?
  recipe=http://www.example.com/recipes/pizza/&
  image[0][url]=http://www.example.com/recipes/pizza/pizza.jpg&
  image[0][user_generated]=true&
  access_token=VALID_ACCESS_TOKEN

The above will only work with http and not with https. If you use https, you will get an error that says: Attached image () failed to upload

Had a similar problem today, which the Sharing Debugger helped me solve. It seems that Facebook can’t (currently) understand images with XMP metadata embedded. When I replaced the images on our articles with versions without XMP metadata, and re-scraped the page (using the Sharing Debugger), the problem went away. A hex editor will help you see whether or not your image contains XMP metadata.
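
If you suspect embedded XMP metadata, one way to produce a clean copy without a hex editor is to rebuild the image from its raw pixel data, which drops all embedded metadata. A minimal sketch using Pillow; the filenames are placeholders:

from PIL import Image

# Placeholders: point these at your own article image.
original = Image.open("article-image.jpg")

# Rebuild the image from raw pixel data so no metadata (XMP, EXIF, ...) is copied over.
stripped = Image.new(original.mode, original.size)
stripped.putdata(list(original.getdata()))
stripped.save("article-image-clean.jpg", quality=90)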

I took the http:// out of my og:image and replaced it with just plain old www., and then it started working fine.

You can use this tool by Facebook to reset your image scrape cache and test which URL it is pulling for the preview image.

In my case, it seems that the crawler is just having a bug. I've tried:

  • Changing links to http only
  • Removing end white space
  • Switching back to http completely
  • Reinstalling the website
  • Installing a bunch of OG plugins (I use WordPress)
  • Suspecting the server has a weird misconfiguration that blocks the bots (because all the OG checkers are unable to fetch tags, and other requests to my sites are unstable)

None of these worked. This cost me a week. And suddenly, out of nowhere, it seems to work again.

Here is my research, in case someone runs into this problem again:

Also, there are more checkers besides Facebook's Object Debugger: OpenGraphCheck.com, Abhinay Rathore's Open Graph Tester, Iframely's Embed Codes, and Twitter's Card Validator.

I can see that the Debugger is retrieving 4 og:image tags from your URL.

The first image is the largest and therefore takes longest to load. Try shrinking that first image down, or change the order so a smaller image comes first.

From what I observed, once your website is publicly accessible it just works fine, even when the image URL is https.

For me this worked:

<meta property="og:url" content="http://yoursiteurl" />
    <meta property="og:image" content="link_to_first_image_if_you_want" />
    <meta property="og:image" content="link_to_second_image_if_you_want" />
    <meta property="og:image:type" content="image/jpeg" /> 
    <meta property="og:image:width" content="400" /> 
    <meta property="og:image:height" content="300" />
    <meta property="og:title" content="your title" />
    <meta property="og:description"  content="your text about homepage"/> 

Once you update the meta tags, make sure the content (image) link is an absolute path, then go to https://developers.facebook.com/tools/debug/sharing, enter your site link, and click "Scrape Again" on the next page.

After several hours of testing and trying things...

I solved this problem as simply as possible. I noticed that the "test pages" inside the Facebook Developers site contain only the og tags and some text in the body tag that refers to those og tags.

So what did I do?

I created a second view in my application containing the same things they use.

And how do I know it is Facebook accessing my page, so I can switch to that view? Facebook uses a unique User-Agent: "facebookexternalhit/1.1"
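
As an illustration of that idea, here is a minimal sketch in Python/Flask; the route, template names, and query parameter mirror the question's product pages but are otherwise made up. It serves a stripped-down og-only view when the request comes from Facebook's crawler:

from flask import Flask, render_template, request

app = Flask(__name__)

@app.route("/apparel-details.php")
def product_page():
    # Facebook's scraper identifies itself with this User-Agent string.
    user_agent = request.headers.get("User-Agent", "")
    if "facebookexternalhit" in user_agent:
        # Hypothetical minimal view: just the og meta tags and a short body referring to them.
        return render_template("og_only.html", product_id=request.args.get("i"))
    return render_template("product.html", product_id=request.args.get("i"))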
